On $Φ$-entropic Dependence Measures and Non-local Correlations

We say that a measure of dependence between two random variables $X$ and $Y$, denoted $ρ(X;Y)$, satisfies the data processing property if $ρ(X;Y)\geq ρ(X';Y')$ for every Markov chain $X'\rightarrow X\rightarrow Y\rightarrow Y'$, and satisfies the tensorization property if $ρ(X_1X_2;Y_1Y_2)=\max\{ρ(X_1;Y_1),ρ(X_2;Y_2)\}$ whenever $(X_1,Y_1)$ is independent of $(X_2,Y_2)$. It is known that measures of dependence defined via $Φ$-entropy satisfy both properties. These measures are important because they generalize Rényi's maximal correlation and the hypercontractivity ribbon. The data processing and tensorization properties are special cases of monotonicity under wirings of non-local boxes. We show that ribbons defined using $Φ$-entropic measures of dependence are monotone under wirings of non-local no-signaling boxes, generalizing an earlier result. In addition, we discuss the evaluation of the $Φ$-strong data processing inequality constant for joint distributions obtained from a $Z$-channel.
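As a concrete illustration of the two properties (not part of the abstract), the following sketch computes Rényi's maximal correlation of a finite joint pmf as the second-largest singular value of the matrix $Q[x,y]=p(x,y)/\sqrt{p(x)p(y)}$, and checks tensorization numerically for two independent pairs; the uniform-input $Z$-channel with crossover $0.2$ and the second joint pmf are illustrative choices, not taken from the paper.

```python
import numpy as np

def maximal_correlation(p_xy):
    """Hirschfeld-Gebelein-Renyi maximal correlation of a finite joint
    pmf, computed as the second-largest singular value of the matrix
    Q[x, y] = p(x, y) / sqrt(p_X(x) * p_Y(y)).  The largest singular
    value of Q is always 1."""
    p_x = p_xy.sum(axis=1)
    p_y = p_xy.sum(axis=0)
    Q = p_xy / np.sqrt(np.outer(p_x, p_y))
    s = np.linalg.svd(Q, compute_uv=False)
    return s[1]

# Joint pmf of a Z-channel with uniform input and crossover 0.2:
# input 0 is received cleanly; input 1 flips to 0 with probability 0.2.
eps = 0.2
p1 = 0.5 * np.array([[1.0, 0.0],
                     [eps, 1.0 - eps]])

# An independent second pair: doubly symmetric binary source, crossover 0.1.
p2 = np.array([[0.45, 0.05],
               [0.05, 0.45]])

rho1 = maximal_correlation(p1)
rho2 = maximal_correlation(p2)

# Tensorization: for independent pairs the joint pmf of (X1 X2; Y1 Y2)
# is the Kronecker product, and the maximal correlation equals the max.
rho12 = maximal_correlation(np.kron(p1, p2))
print(rho1, rho2, rho12)
```

Tensorization holds here because the $Q$ matrix of the product distribution is the Kronecker product $Q_1\otimes Q_2$, whose singular values are all pairwise products; since the top singular value of each factor is $1$, the second-largest product is $\max\{ρ(X_1;Y_1),ρ(X_2;Y_2)\}$.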