Communication and Complexity Constrained Inference over Graphs for Big Data

15.11.2015 - 14.11.2019
Research funding project


Due to the automated collection of data in numerous domains, in particular via the Internet, humanity faces a data deluge, with data volumes reaching the petabyte regime. The resulting challenges and opportunities are referred to as "big data."

The aim of this project is to develop foundations for extracting useful information from massive data sets. Our key hypothesis is that big data problems can be tackled by combining graph signal processing with distributed optimization: distributed optimization copes with high data volume and high acquisition speed, whereas graph signal processing provides a universal language for modeling and managing diverse, decentralized data. We build on the notion of "algorithmic weakening" to achieve a flexible trade-off between statistical accuracy and resource constraints. The main scientific objectives are (i) to devise and assess suitable inference and processing schemes for big data via statistical and compressive graph signal processing, and (ii) to develop and analyze practical implementations based on distributed stochastic optimization. We anticipate that the project will lead to a novel, rich, and versatile framework that is applicable to real-world problems such as large-scale wireless networks, crowd sensing, and social media.
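To illustrate how the two ingredients can interact, the following sketch (purely illustrative and not taken from the project; graph, signal, and parameters are made up) denoises a graph signal by gradient descent on a Laplacian-regularized least-squares objective, where each node updates using only its own observation and its neighbors' current values, i.e. a toy instance of graph signal processing combined with distributed, communication-local optimization.

    # Illustrative sketch (assumed toy setup): distributed graph-signal denoising.
    # Each node i holds a noisy sample y[i] and performs gradient steps on
    #   f(x) = ||x - y||^2 + lam * x^T L x
    # using only values from its neighbors (L is the graph Laplacian).
    import numpy as np

    # Toy graph: a ring of n nodes; the adjacency list keeps updates strictly local.
    n = 20
    neighbors = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}

    # Smooth ground-truth signal plus noise (stand-in for decentralized measurements).
    rng = np.random.default_rng(0)
    true_signal = np.sin(2 * np.pi * np.arange(n) / n)
    y = true_signal + 0.3 * rng.standard_normal(n)

    lam = 1.0      # smoothness weight (graph-signal prior)
    step = 0.1     # gradient step size
    x = y.copy()   # each node starts from its own noisy observation

    for _ in range(200):
        x_new = x.copy()
        for i in range(n):
            # Local gradient: data-fidelity term plus Laplacian smoothness term,
            # computed from node i's value and its neighbors' values only.
            laplacian_i = len(neighbors[i]) * x[i] - sum(x[j] for j in neighbors[i])
            grad_i = 2 * (x[i] - y[i]) + 2 * lam * laplacian_i
            x_new[i] = x[i] - step * grad_i
        x = x_new

    print("noisy    MSE:", np.mean((y - true_signal) ** 2))
    print("denoised MSE:", np.mean((x - true_signal) ** 2))

In a large-scale deployment the per-node loop would run in parallel on the devices themselves, and stochastic or compressive variants would reduce the communication and computation per iteration, which is the kind of accuracy-versus-resources trade-off the project investigates.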

People

Project leader

Institute

Grant funds

  • WWTF (Wiener Wissenschafts-, Forschungs- und Technologiefonds / Vienna Science and Technology Fund), national; call identifier: ICT 2015

Research focus areas

  • Telecommunication: 100%

Publications