
April 12, 2021

Multivariate information measures

Event Date: April 12, 2021
Sponsor: Dr. Christopher Quinn & Dr. Joaquin Goni Cortes
Time: 9:00–10:00 am EDT
Location: https://purdue-edu.zoom.us/j/98319872753?pwd=bExON000WjN1UUhFaU9TNjh1ZmhWZz09
Priority: No
School or Program: Industrial Engineering
College Calendar: Show
Xueyan Niu, Ph.D. Candidate



Many important scientific, engineering, and societal challenges involve large systems of individual agents or components interacting in complex ways. For example, to understand the emergence of consciousness, we study dendritic integration in neurons; to prevent outbreaks of disease and rumors, we trace the dynamics of social networks; to perform complicated scientific experiments, we separate and control the independent variables. Collectively, the interactions among individual neurons, agents, or variables are often non-linear: a subset of the agents may jointly behave in a manner unlike the marginal behaviors of the individuals.

The goal of this thesis is to construct a theoretical framework for measuring, comparing, and representing complex interactions in stochastic systems. Specifically, tools from information theory, geometry, lattice theory, and linear algebra are used to identify and characterize higher-order interactions among random variables. 

We first propose measures of unique, redundant, and synergistic interactions for small stochastic systems using information projections for the exponential family. Because these quantities are measured by the Kullback–Leibler divergence, their magnitudes naturally carry information-theoretic meaning. We prove that they satisfy several desirable properties.
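The abstract's notion of synergy can be illustrated with a standard toy example that is not the thesis's own construction: for two fair independent bits combined by XOR, the joint inputs fully determine the output while each input alone tells us nothing. The sketch below (NumPy; the function name and setup are illustrative choices, not from the abstract) computes mutual information as a Kullback–Leibler divergence between a joint distribution and the product of its marginals.

```python
import numpy as np

def kl_divergence(p, q):
    """KL divergence D(p || q) in nats for discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Joint distribution of (X1, X2, Y) with X1, X2 i.i.d. fair bits
# and Y = X1 XOR X2: p(x1, x2, y) = 1/4 when y = x1 ^ x2, else 0.
joint = np.zeros((2, 2, 2))
for x1 in range(2):
    for x2 in range(2):
        joint[x1, x2, x1 ^ x2] = 0.25

# I((X1, X2); Y) = D(p((x1,x2), y) || p(x1,x2) p(y)),
# flattening the input pair into a single variable.
pxy = joint.reshape(4, 2)             # p((x1,x2), y)
px = pxy.sum(axis=1, keepdims=True)   # p(x1,x2)
py = pxy.sum(axis=0, keepdims=True)   # p(y)
i_joint = kl_divergence(pxy.ravel(), (px * py).ravel())

# Each input alone is independent of Y, so I(X1; Y) = 0.
px1y = joint.sum(axis=1)              # p(x1, y)
i_single = kl_divergence(
    px1y.ravel(),
    np.outer(px1y.sum(axis=1), px1y.sum(axis=0)).ravel())

print(i_joint)   # log 2: one full bit, entirely synergistic
print(i_single)  # 0.0: no unique or redundant information
```

Here all of the one bit of information about Y is synergistic, which is the kind of higher-order interaction a partial information decomposition is designed to isolate.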

We next apply these measures to hypothesis testing and network communications. We interpret the two unique information components as the two types of error in a hypothesis testing problem. We analytically show a duality between the synergistic and redundant information in Gaussian Multiple Access Channels (MAC) and Broadcast Channels (BC), and we establish a novel duality between the partial information decomposition components of the MAC and BC in the general case.

We lastly propose a new concept of representing the partial information decomposition framework with random variables, and give necessary and sufficient conditions for such a representation under the assumption of Gaussianity.

This research has the potential to advance the fields of information theory, statistics, and machine learning by contributing novel ideas, implementing these ideas with innovative tools, and constructing new simulation methods.