$$\mu = E(X) = \sum_x x\,p(x) \ \text{(discrete)}, \qquad \mu = E(X) = \int x\,f(x)\,dx \ \text{(continuous)}$$

$$E(g(X)) = \sum_x g(x)\,p(x), \qquad E(g(X)) = \int g(x)\,f(x)\,dx$$

$$E(aX) = aE(X), \qquad E(X + b) = E(X) + b$$

since, for example, $\int 2f(x)\,dx = 2\int f(x)\,dx$ and $\int (x+2)f(x)\,dx = \int x f(x)\,dx + \int 2 f(x)\,dx = \int x f(x)\,dx + 2$.

$$E(aX + b) = aE(X) + b, \qquad \text{e.g. } E(2X + 3) = 2E(X) + 3 = 13 \ \text{if } E(X) = 5.$$

1, 2, 2, 3, 3, 3, 3, 4, 4, 5 -> Small Variance
1, 1, 2, 2, 3, 3, 4, 4, 5, 5 -> Large Variance
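A quick numerical check of both points above (a minimal sketch using NumPy; the exponential sample is just an arbitrary choice of a distribution with $E(X) = 5$):

```python
import numpy as np

# Linearity: E(2X + 3) = 2 E(X) + 3, checked by simulation.
rng = np.random.default_rng(0)
X = rng.exponential(scale=5.0, size=100_000)   # arbitrary distribution with E(X) = 5
print(np.mean(2 * X + 3), 2 * np.mean(X) + 3)  # both approximately 13

# The two data sets above share the same mean (3) but differ in spread.
a = np.array([1, 2, 2, 3, 3, 3, 3, 4, 4, 5])
b = np.array([1, 1, 2, 2, 3, 3, 4, 4, 5, 5])
print(a.var(), b.var())                        # 1.2 (small) vs 2.0 (large)
```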
$$\sigma^2 = \mathrm{Var}(X) = E[(X - \mu)^2] = E[X^2 - 2\mu X + \mu^2] = E[X^2] - 2\mu E[X] + \mu^2 = E[X^2] - \mu^2$$

Example: $f(x) = \frac{1}{2}(x + 1)$, $-1 < x < 1$.

$$\mu = \int_{-1}^{1} x f(x)\,dx = \frac{1}{2}\int_{-1}^{1} (x^2 + x)\,dx = \frac{1}{3}, \qquad \mathrm{Var}(X) = \int_{-1}^{1} x^2 f(x)\,dx - \mu^2 = \frac{1}{3} - \frac{1}{9} = \frac{2}{9}$$

Median: the value $m$ with $P(X \le m) \ge 1/2$ and $P(X \ge m) \ge 1/2$ (for a continuous $X$, $F(m) = 1/2$).
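The worked example can be checked numerically (a minimal sketch with SciPy; `brentq` just solves $F(m) = 1/2$ for the median on this interval):

```python
from scipy.integrate import quad
from scipy.optimize import brentq

f = lambda x: 0.5 * (x + 1)                    # pdf on (-1, 1)

mu, _  = quad(lambda x: x * f(x), -1, 1)       # E[X]   = 1/3
ex2, _ = quad(lambda x: x**2 * f(x), -1, 1)    # E[X^2] = 1/3
var = ex2 - mu**2                              # Var(X) = 2/9

F = lambda m: quad(f, -1, m)[0]                # cdf F(m) = P(X <= m)
median = brentq(lambda m: F(m) - 0.5, -1, 1)   # F(m) = 1/2  ->  m = sqrt(2) - 1
print(mu, var, median)
```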
Mode: $\arg\max_x P(X = x)$ for a discrete $X$; $\arg\max_x f(x)$ for a continuous $X$.
Moment generating function
$$e^{tX} = 1 + tX + \frac{t^2 X^2}{2!} + \frac{t^3 X^3}{3!} + \cdots + \frac{t^n X^n}{n!} + \cdots$$

$$M(t) = E(e^{tX}) = \int e^{tx} f(x)\,dx, \qquad M'(t) = \frac{d}{dt} E(e^{tX}) = \int x e^{tx} f(x)\,dx, \qquad M''(t) = \frac{d^2}{dt^2} E(e^{tX}) = \int x^2 e^{tx} f(x)\,dx$$

$$M'(t)\big|_{t=0} = E(X), \qquad M''(t)\big|_{t=0} = E(X^2), \qquad \ldots, \qquad M^{(n)}(t)\big|_{t=0} = E(X^n)$$

The moment generating function determines the distribution. $E(X^n)$ is called the $n$th moment.
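As an illustration (a minimal sketch reusing the pdf $f(x) = \frac{1}{2}(x+1)$ from the example above; $M'(0)$ and $M''(0)$ are approximated by central finite differences rather than computed symbolically):

```python
import numpy as np
from scipy.integrate import quad

f = lambda x: 0.5 * (x + 1)                    # pdf on (-1, 1), as above

def M(t):
    """M(t) = E[e^{tX}], evaluated by numerical integration."""
    return quad(lambda x: np.exp(t * x) * f(x), -1, 1)[0]

h = 1e-3                                       # finite-difference step (assumed small enough)
EX  = (M(h) - M(-h)) / (2 * h)                 # ~ M'(0)  = E[X]   = 1/3
EX2 = (M(h) - 2 * M(0) + M(-h)) / h**2         # ~ M''(0) = E[X^2] = 1/3
print(EX, EX2, EX2 - EX**2)                    # variance ~ 2/9
```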
Suppose $X \sim U(0, 1)$ and $Y \sim U[0, 1]$.

These two random variables have the same distribution, but their pdfs are written in different forms (they disagree only at the endpoints, a set of probability zero). For this reason, a distribution does not correspond to a unique pdf. Only the mgf (moment generating function) and the cdf (cumulative distribution function) uniquely determine the distribution.
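A small numerical illustration of this point (a sketch; the two lambdas below are the "open-interval" and "closed-interval" versions of the uniform pdf):

```python
from scipy.integrate import quad

f_open   = lambda x: 1.0 if 0 < x < 1 else 0.0    # pdf of X ~ U(0, 1)
f_closed = lambda x: 1.0 if 0 <= x <= 1 else 0.0  # pdf of Y ~ U[0, 1]

# The two pdfs differ only on {0, 1}, a set of probability zero,
# so they induce exactly the same cdf / probabilities.
for b in (0.25, 0.5, 0.9):
    print(b, quad(f_open, 0, b)[0], quad(f_closed, 0, b)[0])
```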
$X \sim \mathrm{Binom}(n, p)$

(1) $E(X) = np$
$$\begin{aligned}
E(X) &= \sum_{x=0}^{n} x \binom{n}{x} p^x (1-p)^{n-x} = \sum_{x=1}^{n} \frac{n!}{(x-1)!\,(n-x)!}\, p^x (1-p)^{n-x} \\
&= np \sum_{x=1}^{n} \frac{(n-1)!}{(x-1)!\,(n-x)!}\, p^{x-1} (1-p)^{n-x} = np,
\end{aligned}$$

since the remaining sum is $\sum_{k=0}^{n-1} \binom{n-1}{k} p^k (1-p)^{n-1-k} = (p + 1 - p)^{n-1} = 1$.
(2) $\mathrm{Var}(X) = np(1-p)$

$$\begin{aligned}
E(X^2) &= \sum_{x=1}^{n} x \cdot np\, \frac{(n-1)!}{(x-1)!\,(n-x)!}\, p^{x-1} (1-p)^{n-x} \\
&= \sum_{x=1}^{n} (x-1)\, np\, \frac{(n-1)!}{(x-1)!\,(n-x)!}\, p^{x-1} (1-p)^{n-x} + np \\
&= n(n-1)p^2 \sum_{x=2}^{n} \frac{(n-2)!}{(x-2)!\,(n-x)!}\, p^{x-2} (1-p)^{n-x} + np = n(n-1)p^2 + np
\end{aligned}$$

$$\mathrm{Var}(X) = E(X^2) - E(X)^2 = n(n-1)p^2 + np - n^2 p^2 = np(1-p)$$
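Both results can be checked directly from the pmf (a sketch with hypothetical example values $n = 10$, $p = 0.3$):

```python
import numpy as np
from scipy.stats import binom

n, p = 10, 0.3                           # example parameters (arbitrary choice)
x = np.arange(n + 1)
pmf = binom.pmf(x, n, p)

EX  = np.sum(x * pmf)                    # np           = 3.0
EX2 = np.sum(x**2 * pmf)                 # n(n-1)p^2 + np
print(EX, EX2 - EX**2, n * p * (1 - p))  # variance matches np(1-p) = 2.1
```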
(3) Median of $X$

The median $m$ satisfies
$$P(X \le m) = \sum_{x=0}^{m} \binom{n}{x} p^x (1-p)^{n-x} \ge \frac{1}{2} \quad \text{and} \quad P(X \ge m) \ge \frac{1}{2},$$
which is checked case by case from the cdf (there is no simple closed form).
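The case-by-case search is just a scan of the cdf for the smallest $m$ with $P(X \le m) \ge 1/2$ (a sketch, using the same hypothetical $n = 10$, $p = 0.3$):

```python
from scipy.stats import binom

n, p = 10, 0.3
m = 0
while binom.cdf(m, n, p) < 0.5:                        # smallest m with P(X <= m) >= 1/2
    m += 1
print(m, binom.cdf(m - 1, n, p), binom.cdf(m, n, p))   # m = 3; the cdf crosses 1/2 at m
```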
(4) Mode of $X$: $\lfloor (n+1)p \rfloor$ (if $(n+1)p$ is an integer, both $(n+1)p$ and $(n+1)p - 1$ are modes).
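The mode formula can be compared against a direct argmax over the pmf (a sketch, same hypothetical parameters):

```python
import numpy as np
from scipy.stats import binom

n, p = 10, 0.3
x = np.arange(n + 1)
mode_argmax  = x[np.argmax(binom.pmf(x, n, p))]  # direct search over the pmf
mode_formula = int(np.floor((n + 1) * p))        # floor((n+1)p)
print(mode_argmax, mode_formula)                 # both 3
```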