Upload structured values, choose scaling, and compare components. Review covariance, eigenvalues, loadings, and projected scores. Discover compact dimensions that preserve meaningful variation across variables.
Enter a numeric matrix, choose preprocessing options, and calculate principal components. The result section will appear above this form.
Use this sample if you want to test the calculator quickly.
| Feature_A | Feature_B | Feature_C | Feature_D |
|---|---|---|---|
| 2.5 | 2.4 | 1.2 | 8.0 |
| 0.5 | 0.7 | 0.3 | 2.4 |
| 2.2 | 2.9 | 1.8 | 7.5 |
| 1.9 | 2.2 | 1.1 | 6.3 |
| 3.1 | 3.0 | 2.0 | 9.1 |
| 2.3 | 2.7 | 1.5 | 7.2 |
| 2.0 | 1.6 | 0.9 | 5.8 |
| 1.0 | 1.1 | 0.4 | 3.1 |
1. Mean centering: x′ᵢⱼ = xᵢⱼ − μⱼ, where μⱼ is the mean of variable j.
2. Standardization: zᵢⱼ = (xᵢⱼ − μⱼ) / sⱼ, where sⱼ is the sample standard deviation of variable j.
3. Covariance or correlation matrix: S = ZᵀZ / (n − 1), where Z is the preprocessed data matrix.
4. Eigen decomposition: S vₖ = λₖ vₖ, where λₖ is the variance captured by component k.
5. Explained variance ratio: EVRₖ = λₖ / Σₘ λₘ.
6. Component scores: T = Z Vᵣ, where Vᵣ contains the retained eigenvectors as columns.
7. Loadings: L = V diag(√λ₁, …, √λₚ), which measure how strongly each variable contributes to each retained component.
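The steps above can be sketched in NumPy using the sample matrix from the table. This is a minimal illustration of the same pipeline, not the calculator's own implementation; variable names follow the formulas.

```python
import numpy as np

# Sample matrix from the table above (8 observations x 4 features)
X = np.array([
    [2.5, 2.4, 1.2, 8.0],
    [0.5, 0.7, 0.3, 2.4],
    [2.2, 2.9, 1.8, 7.5],
    [1.9, 2.2, 1.1, 6.3],
    [3.1, 3.0, 2.0, 9.1],
    [2.3, 2.7, 1.5, 7.2],
    [2.0, 1.6, 0.9, 5.8],
    [1.0, 1.1, 0.4, 3.1],
])
n, p = X.shape

# Steps 1-2: center each column, then divide by the sample std (ddof=1)
Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)

# Step 3: with standardized data this is the correlation matrix
S = Z.T @ Z / (n - 1)

# Step 4: eigh returns eigenvalues in ascending order, so reverse both
lam, V = np.linalg.eigh(S)
lam, V = lam[::-1], V[:, ::-1]

# Step 5: explained variance ratio per component
evr = lam / lam.sum()

# Step 6: scores for the first two retained components
T = Z @ V[:, :2]

# Step 7: loadings scale eigenvectors by the std each component captures
L = V @ np.diag(np.sqrt(np.clip(lam, 0, None)))
```

Because the data are standardized, the eigenvalues sum to the number of variables (here 4), which makes the explained variance ratios easy to sanity-check.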
PCA transforms many correlated variables into fewer uncorrelated components. It keeps as much variance as possible while reducing dimension, simplifying visualization, modeling, and feature compression.
Standardize when variables use different units or ranges. Without scaling, variables with larger numeric spreads can dominate the covariance matrix and distort the retained components.
Each eigenvalue measures the variance captured by one principal component. Larger eigenvalues indicate components that preserve more information from the original dataset.
Common rules include reaching a cumulative variance target, inspecting the scree plot elbow, or choosing components with strong interpretability for the problem you are solving.
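The cumulative-variance rule is easy to apply programmatically. A sketch with hypothetical eigenvalues and an assumed 90% target:

```python
import numpy as np

# Hypothetical eigenvalues from a fitted PCA, sorted descending
lam = np.array([4.2, 1.3, 0.3, 0.15, 0.05])

# Cumulative explained variance ratio after keeping 1, 2, ... components
cum_evr = np.cumsum(lam) / lam.sum()

# Smallest k whose cumulative explained variance reaches the target
k = int(np.searchsorted(cum_evr, 0.90) + 1)
```

Here the first component alone explains 70% and the first two explain about 92%, so k = 2 meets the 90% target.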
Loadings show how strongly each original variable contributes to a principal component. Large positive or negative values suggest a variable is influential in that direction.
Scores are the transformed coordinates of each observation in component space. They help you detect clusters, trends, outliers, and separation patterns after reduction.
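As a toy illustration (invented score values, not this calculator's output), distance from the centroid in score space is one simple way to flag a candidate outlier:

```python
import numpy as np

# Hypothetical scores (n x 2) from a two-component projection
T = np.array([
    [ 1.8,  0.2],
    [-2.1,  0.1],
    [ 1.5, -0.3],
    [-1.4,  0.2],
    [ 9.0,  4.0],   # far from the rest: a candidate outlier
])

# Euclidean distance of each observation from the score-space centroid
d = np.linalg.norm(T - T.mean(axis=0), axis=1)
outlier = int(np.argmax(d))
```

In practice you would inspect the flagged observation rather than drop it automatically; a large score distance may also signal an interesting subgroup.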
This calculator expects complete numeric data. If values are missing, clean or impute them first so the covariance matrix and eigen decomposition remain valid.
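One simple option, assuming column-mean imputation is acceptable for your data, is to fill each missing entry with its variable's mean before running PCA:

```python
import numpy as np

# Toy matrix with missing entries marked as NaN
X = np.array([
    [2.5, np.nan, 1.2],
    [0.5, 0.7,    0.3],
    [2.2, 2.9,    np.nan],
    [1.9, 2.2,    1.1],
])

# Replace each NaN with its column mean so the covariance matrix stays valid
col_means = np.nanmean(X, axis=0)
X_filled = np.where(np.isnan(X), col_means, X)
```

Mean imputation shrinks the imputed variable's variance, so for heavily incomplete data a more careful method (e.g. model-based imputation) may be preferable.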
Lower-ranked components usually explain little variance. Ignoring them reduces noise, simplifies the feature space, and keeps the most informative directions for analysis.
Important Note: All the calculators listed on this site are for educational purposes only, and we do not guarantee the accuracy of results. Please consult other sources as well.