In order to claim that $`i`$ and $`j`$ are independent, the general form of the equation must hold for all powers $`p`$ and $`q`$:

```math
\rho(i^p, j^q) = 0
```
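
As a minimal numerical sketch (the uniform and Laplace sources, the sample size, and the range of powers below are illustrative assumptions, not details from the text), this condition can be checked empirically for a few low-order powers:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two signals drawn independently: correlations of all their powers should vanish.
i_sig = rng.uniform(-1.0, 1.0, 100_000)
j_sig = rng.laplace(0.0, 1.0, 100_000)

for p in range(1, 4):
    for q in range(1, 4):
        # Pearson correlation between i^p and j^q; near zero for independent signals.
        rho = np.corrcoef(i_sig**p, j_sig**q)[0, 1]
        print(f"rho(i^{p}, j^{q}) = {rho:+.5f}")
```
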
Implementation of this strategy can be seen in the [projection pursuit](https://en.wikipedia.org/wiki/Projection_pursuit) approach. Another way is to use the concept of entropy as a measure: we know that the signals that have maximum joint entropy are mutually independent, and this is what goes into the optimizer. The approach is called [Infomax](https://en.wikipedia.org/wiki/Infomax). [This interpretation](http://www.inf.fu-berlin.de/lehre/WS05/Mustererkennung/infomax/infomax.pdf) is relatively new.
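
As an illustrative sketch of the Infomax idea, the snippet below implements the Bell-Sejnowski natural-gradient update with a logistic nonlinearity; the learning rate, iteration count, and the toy mixing matrix are assumptions chosen for demonstration, not details from the text:

```python
import numpy as np

def infomax_ica(x, lr=0.01, n_iter=2000, seed=0):
    """Bell-Sejnowski Infomax ICA via the natural-gradient update.

    x : (n_sources, n_samples) zero-mean mixed signals.
    Returns an unmixing matrix W such that u = W @ x are the source estimates.
    """
    rng = np.random.default_rng(seed)
    n, m = x.shape
    w = np.eye(n) + 0.01 * rng.standard_normal((n, n))
    for _ in range(n_iter):
        u = w @ x                      # current source estimates
        y = 1.0 / (1.0 + np.exp(-u))   # logistic nonlinearity
        # Natural-gradient Infomax update: dW = (I + (1 - 2y) u^T / m) W
        w += lr * (np.eye(n) + (1.0 - 2.0 * y) @ u.T / m) @ w
    return w

# Toy usage: unmix two linearly mixed super-Gaussian (Laplace) sources.
rng = np.random.default_rng(1)
s = rng.laplace(size=(2, 5000))            # independent sources
a = np.array([[1.0, 0.6], [0.4, 1.0]])     # mixing matrix
x = a @ s
x -= x.mean(axis=1, keepdims=True)         # center the mixtures
w = infomax_ica(x)
print("W @ A (ideally close to a scaled permutation):\n", w @ a)
```

The logistic nonlinearity suits super-Gaussian sources such as speech; for sub-Gaussian sources, the extended Infomax variant flips the sign of the nonlinear term.
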
If the probability of getting an outcome $`X_t`$ is $`p_X(X_t)`$, then the average entropy of $`N`$ observations can be found as:
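
```math
H(X) = -\frac{1}{N} \sum_{t=1}^{N} \log p_X(X_t)
```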