
Introduction to Machine Learning (Zhang Zhihua): Principal Component Analysis

Date: 2020-11-04 20:47:10


Preface

These are study notes for this Peking University course. The concepts are explained in a plain, accessible way, which makes it easy to grasp the basics and, from there, the related techniques.

Basic Concepts

$\exp(-t z^{\frac{1}{2}}) = \int \exp(-tuz)\, dF(u)$

$z = \|x\|^2$

$\exp(-t\|x\|)$.

The product of P.D. kernels is P.D.
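The claim above is the Schur product theorem: the elementwise product of two positive semi-definite kernel matrices is again positive semi-definite. A minimal numerical sketch (the matrices here are illustrative random Gram matrices):

```python
import numpy as np

# Build two PSD Gram matrices and check that their elementwise (Hadamard)
# product is still PSD, i.e. its smallest eigenvalue is non-negative.
rng = np.random.default_rng(4)
A = rng.standard_normal((5, 5))
B = rng.standard_normal((5, 5))
K1 = A @ A.T          # PSD by construction
K2 = B @ B.T          # PSD by construction
K = K1 * K2           # elementwise product

min_eig = np.linalg.eigvalsh(K).min()
# min_eig >= 0 up to floating-point error
```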

The Euclidean distance can be computed after the data are mapped into another (feature) space:

$\|\phi(x) - \phi(y)\|_2^2$
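This distance can be computed without ever forming $\phi$ explicitly, using only the kernel: $\|\phi(x)-\phi(y)\|_2^2 = k(x,x) - 2k(x,y) + k(y,y)$. A small sketch with the Gaussian (RBF) kernel as an example choice:

```python
import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    """Gaussian (RBF) kernel, a positive-definite kernel."""
    return np.exp(-gamma * np.sum((x - y) ** 2))

def feature_space_dist_sq(x, y, kernel):
    # ||phi(x) - phi(y)||^2 = k(x,x) - 2 k(x,y) + k(y,y)
    return kernel(x, x) - 2 * kernel(x, y) + kernel(y, y)

x = np.array([1.0, 2.0])
y = np.array([2.0, 0.0])
d2 = feature_space_dist_sq(x, y, rbf_kernel)
```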

Part 2: Unsupervised Learning

Dimensionality reduction.

PCA(Principal Component Analysis)

Population PCA

Def. If $\bar{x} \in \mathbb{R}^p$ is a random vector with mean $\mu$ and covariance matrix $\Sigma$,

then the PCA is

$\bar{x} \to \bar{y} = U^T(x - \mu)$

where $U$ is orthogonal.

Spectral Decomposition

Thm. If $x \sim N(\mu, \Sigma)$ with spectral decomposition $\Sigma = U \Lambda U^T$, then $y = U^T(x - \mu) \sim N(0, \Lambda)$.

(2) $E(y_i) = 0$,

(3) $\mathrm{Cov}(Y_i, Y_j) = 0$ for $i \neq j$,

(4) $y$ is an orthogonal transform of $x$ whose components are uncorrelated,

(5) $\mathrm{Var}(Y_i) = \lambda_i$.

Sample Principal Component

Let $X = [\bar{x}_1, \ldots, \bar{x}_n]^T$ be an $n \times p$ sample data matrix.

$\bar{x} = \frac{1}{n} \sum_{i=1}^n \bar{x}_i$,

$S = \frac{1}{n} X^T H X$

where $H = I_n - \frac{1}{n} \mathbf{1}_n \mathbf{1}_n^T$ is the centering matrix ($\mathbf{1}_n$ is the all-ones vector).

To reduce the data to $k$ dimensions, keep the first $k$ principal components; this projection retains the most information (variance), which is the point of PCA.
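The steps above can be sketched in a few lines: form $S = \frac{1}{n}X^T H X$, eigendecompose it, and project the centered data onto the top-$k$ eigenvectors (the function name is illustrative):

```python
import numpy as np

def pca(X, k):
    # Sample PCA via the covariance eigendecomposition.
    n = X.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n     # centering matrix
    S = X.T @ H @ X / n                     # sample covariance
    lam, U = np.linalg.eigh(S)              # eigenvalues in ascending order
    idx = np.argsort(lam)[::-1][:k]         # indices of the top-k components
    return (H @ X) @ U[:, idx]              # projected centered data, n x k

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 5))
Y = pca(X, 2)
# Columns of Y are centered and uncorrelated.
```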

SVD

$U$ = eigenvectors of $AA^T$

$D$ = diagonal matrix of singular values, the square roots of the eigenvalues of $AA^T$

$V$ = eigenvectors of $A^T A$
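These relations are easy to verify numerically for $A = U D V^T$: the squared singular values match the non-zero eigenvalues of both $AA^T$ and $A^TA$.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 3))
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Non-zero eigenvalues of AA^T (4x4, rank 3) and A^TA (3x3), descending.
eig_AAt = np.sort(np.linalg.eigvalsh(A @ A.T))[::-1][:3]
eig_AtA = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]
# s**2 equals both; U, s, Vt reassemble A.
```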

PCO (Principal Coordinate Analysis)

$S = X^T H X$

$H$ is idempotent: $HH = H$.

$B = H X X^T H$, the doubly centered inner-product (Gram) matrix.

For any matrices $A$ and $B$, the products $AB$ and $BA$ share the same non-zero eigenvalues.

Hence the non-zero eigenvalues of $S = X^T H X$ and $B = H X X^T H$ are equal.
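Since $H$ is idempotent, $S = (HX)^T(HX)$ and $B = (HX)(HX)^T$, so the shared non-zero eigenvalues can be checked directly. A small sketch with illustrative random data:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 6, 3
X = rng.standard_normal((n, p))
H = np.eye(n) - np.ones((n, n)) / n      # centering matrix, HH = H

S = X.T @ H @ X          # p x p
B = H @ X @ X.T @ H      # n x n

lam_S = np.sort(np.linalg.eigvalsh(S))[::-1]        # p eigenvalues
lam_B = np.sort(np.linalg.eigvalsh(B))[::-1][:p]    # top p of n eigenvalues
# lam_S and lam_B agree: the non-zero eigenvalues coincide.
```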
