Linear Least Squares

(Linear equation: y = mx + c; least-squares function S(m, c) = Σ [y − (mx + c)]²)
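As a minimal sketch of the definition above, the function S(m, c) can be evaluated directly; the data points here are illustrative assumptions, not values from the text.

```python
# Evaluate the least-squares function S(m, c) = Σ [y - (m*x + c)]^2
# for a candidate line y = m*x + c over sample data points.
def S(m, c, xs, ys):
    return sum((y - (m * x + c)) ** 2 for x, y in zip(xs, ys))

# Illustrative (assumed) data, roughly along y = 2x:
xs = [1, 2, 3, 4]
ys = [2.1, 3.9, 6.2, 8.1]

print(S(2.0, 0.0, xs, ys))  # small residual sum for a near-fitting line
```

A smaller value of S(m, c) means the line fits the data more closely; the best-fit line minimizes it.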

Linear least-squares methods allow us to study how variables are related. Let

    Yi = α + βXi + εi,   i = 1, 2, ..., N,

be independent random variables with means E(Yi) = α + βXi, where the errors εi are a random sample from a distribution with mean 0 and standard deviation σ, and the parameters α, β and σ are unknown. The independent variables are called regressors (or covariates); the dependent variable is called the response (or endogenous) variable; ε is the error term.

Given data pairs (x1, y1), (x2, y2), (x3, y3), (x4, y4), the line of best fit y = βx + α minimizes the sum of squared deviations

    S(α, β) = Σ [yi − (βxi + α)]².

Setting the partial derivatives to zero gives the normal equations:

    ∂S/∂β = 0 :  β Σxi² + α Σxi = Σxi yi
    ∂S/∂α = 0 :  β Σxi  + α N  = Σyi

Solving these yields β and α. The discrepancy in y (experimental value minus best-fit value) at each point is εi = yi − (βxi + α); from ε1, ε2, ε3, ε4 one obtains the sum of squared deviations S(α, β) and the standard deviation σ.

Fitting a Curve of Quadratic Function - I

For y = bx², the least-squares function is S(b) = Σ [yi − bxi²]². Setting ∂S/∂b = 0 gives

    b = Σxi² yi / Σxi⁴

and the best-fit curve y = bx². The discrepancies εi = yi − bxi² give the sum of squared deviations S(b) and the standard deviation σ.

Fitting a Curve of Quadratic Function - II

For y = γx² + βx + α, the least-squares function is S(γ, β, α) = Σ [yi − (γxi² + βxi + α)]². The normal equations are:

    ∂S/∂γ = 0 :  γ Σxi⁴ + β Σxi³ + α Σxi² = Σxi² yi
    ∂S/∂β = 0 :  γ Σxi³ + β Σxi² + α Σxi  = Σxi yi
    ∂S/∂α = 0 :  γ Σxi² + β Σxi  + α N   = Σyi

Solving these yields γ, β and α and the best-fit curve y = γx² + βx + α. The discrepancies εi = yi − (γxi² + βxi + α) give the sum of squared deviations S(γ, β, α) and the standard deviation σ.

Reference: http://en.wikipedia.org/wiki/List_of_statistical_packages
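The normal equations for the line of best fit can be solved by accumulating the sums that appear in them. The sketch below does this for illustrative (assumed) data; it is not a general-purpose implementation.

```python
# Fit y = α + βx by solving the normal equations
#   ∂S/∂β = 0 :  β Σx² + α Σx = Σxy
#   ∂S/∂α = 0 :  β Σx  + α N  = Σy
# Eliminating α gives β = (N Σxy − Σx Σy) / (N Σx² − (Σx)²),
# and then α = (Σy − β Σx) / N.
def fit_line(xs, ys):
    n = len(xs)
    sx = sum(xs)
    sy = sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    beta = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    alpha = (sy - beta * sx) / n
    return alpha, beta

# Illustrative (assumed) data, roughly along y = 2x:
xs = [1, 2, 3, 4]
ys = [2.0, 4.1, 5.9, 8.0]
alpha, beta = fit_line(xs, ys)

# Discrepancies ε_i = y_i − (β x_i + α) and sum of squared deviations S(α, β):
eps = [y - (beta * x + alpha) for x, y in zip(xs, ys)]
S = sum(e * e for e in eps)
```

The quadratic fits follow the same pattern with three normal equations instead of two; in practice one would solve that 3×3 linear system with a library routine such as numpy.linalg.solve.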

Least-Squares Regression Line

For n data pairs (x, y), tabulate the columns x, x², y, y² and xy (the worksheet provides rows for up to 15 entries), and accumulate the sums Σx, Σx², Σy, Σy², Σxy together with the means mean(x) and mean(y). From these:

    sxx = Σx² − (Σx)²/n
    syy = Σy² − (Σy)²/n
    sxy = Σxy − (Σx)(Σy)/n

    correlation coefficient  r = sxy / √(sxx · syy)
    slope = sxy / sxx
    y-intercept = mean(y) − slope · mean(x)

Given an input X, the predicted Y-value is Y = y-intercept + slope · X.
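The tabulated computation above can be sketched directly from the column sums; the data here is an assumed, perfectly linear example so the expected results are easy to check by hand.

```python
from math import sqrt

# Compute r, slope and y-intercept from the column sums
# Σx, Σx², Σy, Σy², Σxy, exactly as in the worksheet table.
def regression_from_sums(xs, ys):
    n = len(xs)
    Sx, Sy = sum(xs), sum(ys)
    Sxx = sum(x * x for x in xs)
    Syy = sum(y * y for y in ys)
    Sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = Sxx - Sx ** 2 / n
    syy = Syy - Sy ** 2 / n
    sxy = Sxy - Sx * Sy / n
    r = sxy / sqrt(sxx * syy)          # note the square root in the denominator
    slope = sxy / sxx
    intercept = Sy / n - slope * (Sx / n)
    return r, slope, intercept

# Illustrative (assumed) data lying exactly on y = 2x:
r, slope, intercept = regression_from_sums([1, 2, 3, 4, 5], [2, 4, 6, 8, 10])
# r = 1.0, slope = 2.0, intercept = 0.0

# Predicted Y for an input X:
X = 6
Y_pred = intercept + slope * X  # 12.0
```

Because the sample points lie exactly on a line, r comes out as 1; for real data |r| < 1, and values near ±1 indicate a strong linear relationship.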