Principal Component Analysis
PCA
PCA finds the dominant directions of a point cloud.
Applications:
- Dimensionality reduction
- Surface normal estimation
- Canonical orientation
- Keypoint detection
- Feature description
Physical intuitions:
- Vector Dot Product
- Matrix-Vector Multiplication
- Singular Value Decomposition (SVD)
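The three intuitions above can be illustrated with a few lines of NumPy (the example values are my own, not from the notes):

```python
import numpy as np

# Dot product: the length of a's projection onto the unit vector u
a = np.array([3.0, 4.0])
u = np.array([1.0, 0.0])
proj_len = a @ u  # the "shadow" of a along u

# Matrix-vector product: A x is a linear combination of A's columns
A = np.array([[1.0, 2.0], [0.0, 1.0]])
x = np.array([2.0, 3.0])
assert np.allclose(A @ x, 2.0 * A[:, 0] + 3.0 * A[:, 1])

# SVD: any matrix factors into rotation · axis scaling · rotation
U, s, Vt = np.linalg.svd(A)
assert np.allclose(A, U @ np.diag(s) @ Vt)
```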
Spectral Theorem:
$$A = U \Lambda U^T = \sum_{i=1}^{n} \lambda_i u_i u_i^T, \qquad \Lambda = \mathrm{diag}(\lambda_1, \dots, \lambda_n)$$
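A quick numerical check of the spectral theorem, using a random symmetric matrix (the example is mine, assuming standard NumPy):

```python
import numpy as np

# Build a random symmetric matrix
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B + B.T

# eigh returns eigenvalues (ascending) and orthonormal eigenvectors
lam, U = np.linalg.eigh(A)

# Reconstruct A as the sum of rank-one terms λ_i u_i u_i^T
A_rec = sum(lam[i] * np.outer(U[:, i], U[:, i]) for i in range(4))
assert np.allclose(A, A_rec)
```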
Rayleigh Quotients:
$$\lambda_{\min}(A) \le \frac{x^T A x}{x^T x} \le \lambda_{\max}(A), \qquad \forall x \neq 0$$
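These bounds can be verified empirically for a random symmetric matrix (a sketch with arbitrary test data, assuming standard NumPy):

```python
import numpy as np

# Random symmetric matrix
rng = np.random.default_rng(1)
B = rng.standard_normal((5, 5))
A = B + B.T

lam = np.linalg.eigvalsh(A)  # eigenvalues, sorted ascending

# The Rayleigh quotient of any nonzero x lies in [λ_min, λ_max]
for _ in range(1000):
    x = rng.standard_normal(5)
    q = x @ A @ x / (x @ x)
    assert lam[0] - 1e-9 <= q <= lam[-1] + 1e-9
```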
Summary:
- Center the data at the mean:
$$\tilde{X} = [\tilde{x}_1, \dots, \tilde{x}_m], \qquad \tilde{x}_i = x_i - \bar{x}$$
- Compute SVD:
$$H = \tilde{X}\tilde{X}^T = U_r \Sigma^2 U_r^T$$
- The principal vectors are the columns of $U_r$ (the left singular vectors of $\tilde{X}$ are the eigenvectors of $H$)
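The steps above can be sketched in a few lines of NumPy (function and variable names are mine; the data layout follows the notes, one point per column):

```python
import numpy as np

def pca_directions(X):
    """PCA via SVD of the centered data matrix.

    X: (n, m) array, one point per column, as in X = [x_1, ..., x_m].
    Returns (U, s): principal directions as columns of U and the
    singular values of the centered matrix (Σ in the notes).
    """
    Xc = X - X.mean(axis=1, keepdims=True)   # x̃_i = x_i − x̄
    U, s, _ = np.linalg.svd(Xc, full_matrices=False)
    return U, s

# Usage: a 2-D point cloud stretched along the first axis
rng = np.random.default_rng(0)
X = rng.standard_normal((2, 500)) * np.array([[3.0], [0.5]])
U, s = pca_directions(X)
# U[:, 0] is the dominant direction of the cloud (≈ the x-axis here)
```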
KPCA
The eigenproblem for the feature-space covariance $\tilde{H}$ reduces to an eigenproblem on the Gram matrix:
$$\tilde{H}\tilde{z} = \tilde{\lambda}\tilde{z} \;\Rightarrow\; \tilde{z} = \sum_{j=1}^{N} \alpha_j \phi(x_j) \;\Rightarrow\; K\alpha = \lambda\alpha$$
Normalization of $\tilde{z}$:
$$\lambda_r\, \alpha_r^T \alpha_r = 1 \;\Rightarrow\; \alpha_r^T \alpha_r = \frac{1}{\lambda_r}$$
Kernel:
- Linear: $k(x_i, x_j) = x_i^T x_j$
- Polynomial: $k(x_i, x_j) = (1 + x_i^T x_j)^p$
- Gaussian: $k(x_i, x_j) = e^{-\beta \|x_i - x_j\|^2}$
- Laplacian: $k(x_i, x_j) = e^{-\beta \|x_i - x_j\|_1}$
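The four kernels above, written as plain Python functions (the function names and default hyperparameters `p` and `beta` are my choices):

```python
import numpy as np

def linear(xi, xj):
    return xi @ xj

def polynomial(xi, xj, p=2):
    return (1.0 + xi @ xj) ** p

def gaussian(xi, xj, beta=1.0):
    # squared Euclidean distance in the exponent
    return np.exp(-beta * np.sum((xi - xj) ** 2))

def laplacian(xi, xj, beta=1.0):
    # L1 (Manhattan) distance in the exponent
    return np.exp(-beta * np.sum(np.abs(xi - xj)))
```

Each takes two vectors and returns a scalar similarity; any of them can serve as `k(x_i, x_j)` in the KPCA steps below.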
Summary
- Select a kernel $k(x_i, x_j)$ and compute the Gram matrix $K(i,j) = k(x_i, x_j)$
- Center $K$:
$$\tilde{K} = K - \mathbf{1}_N K - K \mathbf{1}_N + \mathbf{1}_N K \mathbf{1}_N, \qquad \mathbf{1}_N = \tfrac{1}{N}\mathbf{1}\mathbf{1}^T$$
- Solve the eigenvalues/eigenvectors of $\tilde{K}$:
$$\tilde{K}\alpha_r = \lambda_r \alpha_r$$
- Normalize $\alpha_r$ so that $\alpha_r^T \alpha_r = \frac{1}{\lambda_r}$
- For any data point $x \in \mathbb{R}^n$, compute its projection onto the $r$-th principal component, $y_r \in \mathbb{R}$:
$$y_r = \phi(x)^T \tilde{z}_r = \sum_{j=1}^{N} \alpha_{rj}\, k(x, x_j)$$
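The KPCA steps above can be sketched as a single function (a minimal sketch; the function name, data layout, and toy data are my own assumptions):

```python
import numpy as np

def kpca_projection(X, kernel, r=2):
    """Kernel PCA following the steps above.

    X: (N, n) array with one point per row; kernel(xi, xj) -> scalar.
    Returns Y: (N, r), projections of the training points onto the
    top-r principal components.
    """
    N = X.shape[0]
    # Gram matrix K(i, j) = k(x_i, x_j)
    K = np.array([[kernel(xi, xj) for xj in X] for xi in X])
    # Center: K̃ = K − 1_N K − K 1_N + 1_N K 1_N, with 1_N = (1/N) 11^T
    one = np.full((N, N), 1.0 / N)
    Kt = K - one @ K - K @ one + one @ K @ one
    # Eigendecomposition (eigh returns ascending order, so flip)
    lam, alpha = np.linalg.eigh(Kt)
    lam, alpha = lam[::-1], alpha[:, ::-1]
    # Normalize each α_r so that α_r^T α_r = 1/λ_r
    alpha = alpha[:, :r] / np.sqrt(np.maximum(lam[:r], 1e-12))
    # y_r(x_i) = Σ_j α_rj k̃(x_i, x_j), i.e. K̃ α for the training points
    return Kt @ alpha

# Usage with a Gaussian kernel, β = 1.0 (hyperparameter is my choice)
rng = np.random.default_rng(0)
X = rng.standard_normal((30, 3))
gauss = lambda a, b: np.exp(-np.sum((a - b) ** 2))
Y = kpca_projection(X, gauss, r=2)
```

Projecting a new point $x$ would instead evaluate $\sum_j \alpha_{rj}\, k(x, x_j)$ against the training set, as in the last step above.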