Dimensionality reduction · Changes

Update Dimensionality reduction, authored Nov 29, 2021 by cihan.ates
DDE-1/Dimensionality-reduction.md
```math
S = WX
```
All we need to do is find the values of $`\alpha_{ij}`$ in W. Note that we must find W while A is unknown (we cannot simply invert A). What we do know is that W defines vectors in the mixture space, and each such vector (e.g. $`[\alpha_{11}, \alpha_{12}]`$) extracts one source signal (here, $`s_{1}`$). Looking at the sketch of ICA above, these vectors must be orthogonal to the samples associated with all sources except the one they describe. So we need to find a W in which each vector is orthogonal to all sources but one. Now we are getting closer to defining an optimization problem.
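To make the orthogonality claim concrete (this short derivation is an addition, not part of the original notes): if W were the exact unmixing matrix, it would satisfy $`WA = I`$, so each row $`w_{i}`$ of W would obey

```math
w_{i} \cdot a_{j} = \delta_{ij} = \begin{cases} 1 & i = j \\ 0 & i \neq j \end{cases}
```

where $`a_{j}`$ is the j-th column of A, i.e. the direction along which source $`s_{j}`$ enters the mixtures. Each row of W therefore suppresses every source except the one it extracts, which is exactly the "orthogonal to all sources but one" condition above.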
We also said that we are after independent signals. In saying so, we assume that the sources reflect this property better than the merged signals do. With this constraint, we can say: "I will find a W such that independence is maximized in the extracted signals."
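The optimization sketched above is what FastICA implements: it searches for a W that maximizes the non-Gaussianity of the extracted signals, a practical proxy for independence. A minimal sketch using scikit-learn (the mixing matrix, signals, and random seed below are illustrative assumptions, not from the original notes):

```python
import numpy as np
from sklearn.decomposition import FastICA

# Two independent, zero-mean, non-Gaussian sources (illustrative choice):
# a sine wave and a square wave.
t = np.linspace(0, 8, 2000)
S = np.c_[np.sin(3 * t), np.sign(np.sin(5 * t))]  # shape (n_samples, 2)

# Mix them with an "unknown" mixing matrix A: each observed channel
# is a linear combination of the sources.
A = np.array([[1.0, 0.5],
              [0.4, 1.2]])
X = S @ A.T  # observed mixtures, shape (n_samples, 2)

# FastICA estimates the unmixing matrix W by maximizing the
# non-Gaussianity (a proxy for independence) of the extracted signals.
ica = FastICA(n_components=2, whiten="unit-variance", random_state=0)
S_est = ica.fit_transform(X)  # recovered sources, up to scale/permutation
W = ica.components_           # estimated unmixing matrix

# Each recovered component should correlate strongly with exactly one
# true source (the correlation magnitudes should be close to 1).
C = np.abs(np.corrcoef(S.T, S_est.T))[:2, 2:]
print(C.max(axis=1))
```

Note the two inherent ambiguities of ICA visible here: the recovered sources come back in arbitrary order and with arbitrary scale/sign, which is why the check above uses the absolute correlation with each true source rather than comparing W to the inverse of A directly.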
----------------------