Conditional information dimension example

Lucy1234

Given a random variable X that is uniformly distributed on [MATH][-b,b] [/MATH] and [MATH]Y=g(X) [/MATH] with
[MATH]g(x) = \begin{cases} 0, ~~~ x\in [-c,c] \\ x, ~~~ \text{else}\end{cases} [/MATH]
where [MATH]0 < c < b [/MATH]. Now I want to compute the information dimensions [MATH]d(X) [/MATH] and [MATH]d(Y) [/MATH] and the conditional information dimension [MATH]d(X|Y) [/MATH], and show that [MATH]d(X) = d(X|Y) + d(Y) [/MATH] holds in this case.

The information dimension is defined as
[MATH]d(X) = \lim_{m\rightarrow \infty} \frac{H(\hat{X}^{(m)})}{m}, [/MATH] where [MATH]\hat{X}^{(m)} := \frac{\lfloor 2^m X \rfloor}{2^m} [/MATH] is the quantization of X.

For a discrete distribution, [MATH]d(X) = 0 [/MATH], and for an absolutely continuous one-dimensional distribution, [MATH]d(X) = 1 [/MATH]. For a discrete-continuous mixture of the form [MATH]P_X = \rho\, P_X^{(ac)} + (1-\rho)\, P_X^{(d)} [/MATH] with [MATH]0 \le \rho \le 1 [/MATH], the information dimension equals the weight of the continuous part, [MATH]d(X)=\rho [/MATH].
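For instance (a toy example of my own, just to fix ideas), the mixture [MATH]P_Z = \tfrac{1}{2}\,\delta_0 + \tfrac{1}{2}\,\mathrm{Unif}[0,1] [/MATH] has [MATH]d(Z) = \tfrac{1}{2} [/MATH]: half of the mass sits on an atom, half is spread out continuously.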

Now I know that X is absolutely continuous, so [MATH]d(X) = 1 [/MATH]. The distribution [MATH]P_Y [/MATH] is a discrete-continuous mixture: it has an atom at zero with
[MATH] P(Y=0) = \frac{2c}{2b} = \frac{c}{b}, [/MATH] and its continuous part has density
[MATH] f_Y(y) = \frac{1}{2b}, \quad y \in [-b,-c) \cup (c,b], [/MATH] which carries total mass [MATH]\frac{b-c}{b} [/MATH]. Therefore, [MATH]d(Y)=\frac{b-c}{b} [/MATH].
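As a sanity check (my own addition, not part of the exercise), a small Monte Carlo sketch can estimate these dimensions directly from the definition: sample X, build Y = g(X), quantize both, and compute the empirical entropy divided by m. The concrete values b = 1, c = 0.5, the sample size, and the helper name empirical_entropy_bits are arbitrary choices of mine; the printed ratios should drift towards d(X) = 1 and d(Y) = (b-c)/b = 0.5 as m grows (slowly, since the entropy behaves like m times the dimension plus a constant).

[CODE]
import numpy as np

# Monte Carlo sketch (my own check): estimate H(quantized variable) / m
# for a few resolutions m and compare with the predicted dimensions.
# Assumed parameters: b = 1.0, c = 0.5, so d(X) should be 1 and
# d(Y) should be (b - c)/b = 0.5.

rng = np.random.default_rng(0)
b, c, n = 1.0, 0.5, 2_000_000

x = rng.uniform(-b, b, size=n)          # X ~ Unif[-b, b]
y = np.where(np.abs(x) <= c, 0.0, x)    # Y = g(X)

def empirical_entropy_bits(samples, m):
    """Plug-in entropy (in bits) of the quantization floor(2^m * samples) / 2^m."""
    cells = np.floor(samples * 2**m)             # quantization cell index
    _, counts = np.unique(cells, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

for m in (6, 8, 10):
    print(m,
          empirical_entropy_bits(x, m) / m,      # tends to d(X) = 1
          empirical_entropy_bits(y, m) / m)      # tends to d(Y) = 0.5
[/CODE]

(At m = 10 with these parameters the exact entropies are 11 and 6 bits, i.e. ratios 1.1 and 0.6, so the convergence towards 1 and 0.5 is visible but slow.)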

Now my question is the following: how do I compute the conditional information dimension?
[MATH]d(X|Y) = \lim_{m \rightarrow \infty} \frac{H(\hat{X}^{(m)}|Y)}{m} = \int_\mathcal{Y} d(X|Y=y)\, dP_Y(y) = \mathbb{E}_{Y\sim P_Y}\big[d(X|Y)\big] [/MATH]
I have the formula, but I am not sure whether I am applying it correctly to this example. My attempt is below.
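Splitting the expectation according to the two components of [MATH]P_Y [/MATH] found above (this is only my guess at the intended route, so please correct me if it is wrong): with probability [MATH]\frac{c}{b} [/MATH] we have [MATH]Y=0 [/MATH], and then [MATH]X \mid \{Y=0\} [/MATH] is uniform on [MATH][-c,c] [/MATH], hence absolutely continuous, so [MATH]d(X|Y=0)=1 [/MATH]. For every other value [MATH]y \neq 0 [/MATH] (the continuous part of [MATH]P_Y [/MATH], of total mass [MATH]\frac{b-c}{b} [/MATH]) we have [MATH]X=y [/MATH] almost surely, a point mass, so [MATH]d(X|Y=y)=0 [/MATH]. Plugging this into the integral gives
[MATH] d(X|Y) = \frac{c}{b}\cdot 1 + \frac{b-c}{b}\cdot 0 = \frac{c}{b}, [/MATH]
and then indeed
[MATH] d(X|Y) + d(Y) = \frac{c}{b} + \frac{b-c}{b} = 1 = d(X). [/MATH]
Is this the correct way to evaluate the conditional information dimension?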
 