Project Details

Robust Sequential Analysis in Networks

Subject Area: Electronic Semiconductors, Components and Circuits, Integrated Systems, Sensor Technology, Theoretical Electrical Engineering
Term: from 2019 to 2024
Project identifier: Deutsche Forschungsgemeinschaft (DFG) - Project number 431431951
 
Sequential analysis is concerned with statistical inference when the number of samples is not given a priori, but is chosen based on the data observed so far. Sequential detectors have been shown to significantly reduce the average number of samples compared to equivalent fixed-sample-size detectors. They find application in many fields where high efficiency is required, in particular in situations where taking samples is expensive or detection delay is critical. In contrast, the idea underpinning robust statistics is to sacrifice some efficiency under ideal conditions in order to be less sensitive to deviations from the ideal case. That is, robust procedures are designed to perform well in a neighborhood of the assumed model, typically allowing for small but arbitrary deviations. In our previous work, in particular the DFG project "Robust Sequential Analysis", we investigated the benefits of combining sequential and robust statistics in order to make fast yet statistically reliable decisions. With the Roseanne project, we plan to extend this line of research to distributed systems and networks. The latter are highly relevant for future communication and signal processing systems, in particular in the context of Smart Cities and the Internet of Things. Yet, very few results on robust sequential detection in networks can be found in the literature.
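As an illustration of the sequential principle described above (not part of the project description), the following is a minimal sketch of Wald's sequential probability ratio test (SPRT) for two Gaussian hypotheses; the parameters mu0, mu1, sigma, alpha, and beta are illustrative assumptions, not values from the project.

```python
import math
import random

def sprt_gaussian(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.05, beta=0.05):
    """Wald's SPRT for H0: mean = mu0 vs. H1: mean = mu1, known variance.

    Returns (decision, n_used): decision is 0 or 1, or None if the
    sample budget is exhausted before a threshold is crossed.
    """
    # Thresholds from Wald's approximations for error levels alpha, beta.
    a = math.log(beta / (1 - alpha))   # lower threshold -> accept H0
    b = math.log((1 - beta) / alpha)   # upper threshold -> accept H1
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        # Log-likelihood ratio increment for a Gaussian with known sigma.
        llr += (mu1 - mu0) * (x - (mu0 + mu1) / 2) / sigma**2
        if llr >= b:
            return 1, n
        if llr <= a:
            return 0, n
    return None, len(samples)

random.seed(0)
data = [random.gauss(1.0, 1.0) for _ in range(1000)]  # data drawn under H1
decision, n = sprt_gaussian(data)
```

Because the test stops as soon as the accumulated evidence crosses a threshold, the number of samples actually used is data-dependent and, on average, much smaller than a fixed-sample-size test with the same error levels would need.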
Most importantly:

1) There is no distributed equivalent to uncertainty models such as outliers or neighborhoods of probability distributions.
2) Little is known about the relation between node-wise uncertainty and network-wide uncertainty.
3) Techniques from robust centralized detection, such as clipping or censoring, do not reliably robustify distributed detection.
4) There are no results on the amount of uncertainty a distributed detector can tolerate without breaking down.

As a consequence, in the majority of existing works, robustness is defined in a purely qualitative manner, and the results either apply only to small networks or are based on empirically motivated, application-specific heuristics. The first aim of this project is to narrow this gap in understanding by providing a solid foundation for a theory of robust sequential detection in networks. Based on this foundation, the second aim is to develop and implement sequential distributed detection algorithms that are robust in a well-defined and quantifiable manner. A successful completion of the project would pave the way for a general and rigorous framework of robust distributed statistics, comparable to the existing body of work on centralized robust detection.

The Signal Processing Group is internationally recognized for its work on robust statistics as well as on sequential and distributed signal processing. We therefore see ourselves in a uniquely favorable position to complete the proposed project, which brings together these core areas of expertise.
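To make the clipping technique mentioned above concrete (again as an illustration, not as the project's method): in centralized robust detection, the influence of any single, possibly outlying, observation is limited by clipping its log-likelihood ratio increment to a fixed band. The function below sketches this for the Gaussian case; mu0, mu1, sigma, and the clipping level c are illustrative assumptions.

```python
def clipped_llr_increment(x, mu0=0.0, mu1=1.0, sigma=1.0, c=2.0):
    """Huber-style clipping of the Gaussian LLR increment.

    Limits the contribution of a single observation to [-c, c], so that
    one outlier cannot dominate the accumulated test statistic.
    All parameter values here are illustrative assumptions.
    """
    z = (mu1 - mu0) * (x - (mu0 + mu1) / 2) / sigma**2
    return max(-c, min(c, z))
```

For example, a gross outlier such as x = 100 would contribute an increment of 99.5 to the unclipped statistic, but only c = 2.0 after clipping. As the project description notes, however, such node-wise clipping does not reliably robustify detection in distributed settings, which is one of the gaps the project aims to address.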
DFG Programme: Research Grants
Co-Investigator: Dr.-Ing. Michael Fauß
 
 
