Linear least squares methods allow us to study how variables are related. Let Y_i = α + βX_i + ε_i for i = 1, 2, …, N be independent random variables with means E(Y_i) = α + βX_i, where the errors ε_i form a random sample from a distribution with mean 0 and standard deviation σ, and all parameters (α, β, and σ) are unknown. The independent variables are called regressors or covariates; the dependent variable is called the response (or endogenous) variable; ε is the error.
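As a sketch of the model above, the following Python snippet simulates Y_i = α + βX_i + ε_i and checks that the simulated errors average to roughly 0. The parameter values here are made up purely for illustration; the worksheet treats them as unknown.

```python
import random

# Illustrative (made-up) parameter values; in the worksheet they are unknown.
alpha, beta, sigma = 2.0, 0.5, 1.0
N = 10_000

random.seed(0)
x = [i / 100 for i in range(N)]
# Y_i = alpha + beta*X_i + eps_i, with eps_i drawn from Normal(0, sigma)
y = [alpha + beta * xi + random.gauss(0.0, sigma) for xi in x]

# Because the errors have mean 0, the average residual about the true line
# should be close to 0 for large N.
mean_err = sum(yi - (alpha + beta * xi) for xi, yi in zip(x, y)) / N
print(mean_err)
```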
Dataset	Independent variable	Dependent variable
1	x1	y1
2	x2	y2
3	x3	y3
4	x4	y4

Setting the partial derivatives of S(α, β) = Σ [y_i − (α + βx_i)]² to zero gives the normal equations:

∂S/∂β = 0:  β Σx_i² + α Σx_i = Σ x_i y_i

∂S/∂α = 0:  β Σx_i + N α = Σ y_i

Solving,

β = [N Σx_i y_i − (Σx_i)(Σy_i)] / [N Σx_i² − (Σx_i)²]

α = (Σy_i − β Σx_i) / N

Line of best fit: y = βx + α

Discrepancy in y (expt. value − best-fit value): ε1 =   ε2 =   ε3 =   ε4 =


Sum of squares of deviations: S(α, β) = Σ [y_i − (α + βx_i)]²



Standard deviation (σ)





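The normal equations for the line of best fit can be sketched in Python as below. The four data points are made up for illustration, since the worksheet's x and y columns are left blank.

```python
# Sketch of solving the normal equations for the line of best fit.
# Data points are made up for illustration (the worksheet's values are blank).
x = [1.0, 2.0, 3.0, 4.0]
y = [2.1, 4.2, 5.9, 8.1]
N = len(x)

Sx  = sum(x)
Sy  = sum(y)
Sxx = sum(xi * xi for xi in x)
Sxy = sum(xi * yi for xi, yi in zip(x, y))

# beta = (N*Sxy - Sx*Sy) / (N*Sxx - Sx^2),  alpha = (Sy - beta*Sx) / N
beta  = (N * Sxy - Sx * Sy) / (N * Sxx - Sx * Sx)
alpha = (Sy - beta * Sx) / N

# Residuals eps_i = y_i - (alpha + beta*x_i) and S(alpha, beta) = sum of eps_i^2
eps = [yi - (alpha + beta * xi) for xi, yi in zip(x, y)]
S = sum(e * e for e in eps)
print(beta, alpha, S)
```

A useful check on any least-squares line: the residuals always sum to 0, because ∂S/∂α = 0 forces Σε_i = 0.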
Fitting a Curve: Quadratic Function I

y = bx²;  S(b) = Σ [y_i − bx_i²]²

∂S/∂b = 0 gives

b = (Σ x_i² y_i) / (Σ x_i⁴)

Best-fit curve: y = bx²


Discrepancy in y (expt. value − best-fit value): ε1 =   ε2 =   ε3 =   ε4 =


Sum of squares of deviations: S(b)

Standard deviation (σ)






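The one-parameter quadratic fit has a closed form, b = Σx_i²y_i / Σx_i⁴, which can be sketched directly; again the data are made up for illustration.

```python
# Sketch of the one-parameter quadratic fit y = b*x^2.
# Setting dS/db = 0 for S(b) = sum of (y_i - b*x_i^2)^2 gives
#   b = sum(x_i^2 * y_i) / sum(x_i^4)
# Data points are made up for illustration.
x = [1.0, 2.0, 3.0, 4.0]
y = [1.1, 3.9, 9.2, 15.8]

b = sum(xi**2 * yi for xi, yi in zip(x, y)) / sum(xi**4 for xi in x)

# Residuals and sum of squares of deviations S(b)
eps = [yi - b * xi**2 for xi, yi in zip(x, y)]
S = sum(e * e for e in eps)
print(b, S)
```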
Fitting a Curve: Quadratic Function II

y = γx² + βx + α;  S(γ, β, α) = Σ [y_i − (γx_i² + βx_i + α)]²

Setting the partial derivatives to zero gives the normal equations:

∂S/∂γ = 0:  γ Σx_i⁴ + β Σx_i³ + α Σx_i² = Σ x_i² y_i

∂S/∂β = 0:  γ Σx_i³ + β Σx_i² + α Σx_i = Σ x_i y_i

∂S/∂α = 0:  γ Σx_i² + β Σx_i + N α = Σ y_i

γ = 

β = 

α = 

Best-fit curve: y = γx² + βx + α

Discrepancy in y (expt. value − best-fit value): ε1 =   ε2 =   ε3 =   ε4 = 

Sum of squares of deviations: S(γ, β, α)



Standard deviation (σ)


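Unlike the one-parameter case, the full quadratic fit requires solving the three normal equations simultaneously. One way to do that (an illustrative choice, not prescribed by the worksheet) is Gaussian elimination on the 3×3 system; the data below are made up, chosen to lie near y = x² + 1.

```python
# Sketch of the three normal equations for y = g*x^2 + b*x + a
# (g, b, a stand for gamma, beta, alpha), solved by Gauss-Jordan
# elimination. Data points are made up for illustration.
x = [0.0, 1.0, 2.0, 3.0]
y = [1.0, 2.1, 5.2, 10.1]
N = len(x)

S1 = sum(x);                       S2 = sum(xi**2 for xi in x)
S3 = sum(xi**3 for xi in x);       S4 = sum(xi**4 for xi in x)
T0 = sum(y)
T1 = sum(xi * yi for xi, yi in zip(x, y))
T2 = sum(xi * xi * yi for xi, yi in zip(x, y))

# dS/dg = 0: g*S4 + b*S3 + a*S2 = T2
# dS/db = 0: g*S3 + b*S2 + a*S1 = T1
# dS/da = 0: g*S2 + b*S1 + a*N  = T0
A = [[S4, S3, S2, T2],
     [S3, S2, S1, T1],
     [S2, S1, N,  T0]]

# Gauss-Jordan elimination: reduce A to the identity, leaving the
# solution in the augmented column. (No pivoting needed here: the
# normal-equations matrix is symmetric positive definite.)
for i in range(3):
    p = A[i][i]
    A[i] = [v / p for v in A[i]]
    for j in range(3):
        if j != i:
            f = A[j][i]
            A[j] = [vj - f * vi for vj, vi in zip(A[j], A[i])]

g, b, a = A[0][3], A[1][3], A[2][3]
print(g, b, a)
```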






