# Dimensionality reduction
## Dimensionality reduction methods
### Principal Component Analysis
PCA is one of the most popular dimensionality reduction methods. It is a linear, orthogonal projection method in which high-dimensional data is projected onto a lower-dimensional space such that the variance of the projected data is maximized. We can again make an analogy with the shadow game: this time, our objective is to find the right direction for the light so that the features of the high-dimensional (3D) object are preserved as much as possible in the lower-dimensional (2D) space. In other words, we perform the data projection so as to minimize the information loss.
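In its standard formulation (stated here for reference; $\boldsymbol{\Sigma}$ denotes the sample covariance matrix of the mean-centered data), PCA looks for the unit direction $\mathbf{w}$ along which the variance of the projected data is largest:

```math
\max_{\mathbf{w}} \; \mathbf{w}^\top \boldsymbol{\Sigma}\, \mathbf{w} \quad \text{subject to} \quad \|\mathbf{w}\|_2 = 1
```

The maximizer is the eigenvector of $\boldsymbol{\Sigma}$ with the largest eigenvalue; subsequent principal components are the remaining eigenvectors, ordered by decreasing eigenvalue.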
How does the data compression process work? We again have the data matrix X, where each row represents a different instance and the columns (dimensions) are the features. The process is distance based (Euclidean).
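Below is a minimal NumPy sketch of this pipeline (the random data, variable names, and the choice of two components are illustrative, not from the original text):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))     # 200 instances (rows), 5 features (columns)

# 1. Center each feature: principal directions are defined relative to the mean.
X_centered = X - X.mean(axis=0)

# 2. Sample covariance matrix of the features (5 x 5).
cov = np.cov(X_centered, rowvar=False)

# 3. Eigendecomposition: eigenvectors are the principal directions,
#    eigenvalues the variance captured along each direction.
eigvals, eigvecs = np.linalg.eigh(cov)

# 4. Sort by decreasing eigenvalue and keep the top k components.
order = np.argsort(eigvals)[::-1]
k = 2
W = eigvecs[:, order[:k]]         # projection matrix (5 x 2)

# 5. Project the data onto the lower-dimensional space.
X_reduced = X_centered @ W        # shape (200, 2)

print(X_reduced.shape)            # -> (200, 2)
```

Up to the sign of each component, this matches what `sklearn.decomposition.PCA(n_components=2).fit_transform(X)` computes.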