'''sum of squares (SS)'''

Sub: sum of squares error - [[제곱합오차,square_sum_error]] ... [[오차,error]]

residual sum of squares (RSS)
= sum of squared residuals (SSR)
= sum of squared estimate of errors (SSE)
''tmp curr [[잔차,residual]]''

----
MKLINK
[[제곱,square]]
[[합,sum]]
[[제곱근,square_root]]
[[노름,norm]] esp. the L2 norm
[[분산,variance]] is a sum of squares divided by the dof.

Srch:sum_of_sq
[[평균제곱오차,mean_square_error,MSE]]

[[선형회귀,linear_regression]] is about finding the [[해,solution]] that minimizes the '''sum of squares''' of the [[차이,difference]]s ([[오차,error]]s) between the [[근사해,approximate_solution]] and (...)??? Are there other methods besides this? CHK
Ggl:선형회귀

rel
[[power_sum]] - WpEn:Sums_of_powers
[[reciprocal_sum]] - WpEn:Sum_of_reciprocals

----
Twins:
[[WpKo:제곱합]] = https://ko.wikipedia.org/wiki/제곱합
[[WpEn:Sum_of_squares]]
...
Naver:제곱합 Google:제곱합 Google:sum.of.squares

Up: [[제곱,square]] [[합,sum]]
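
A minimal sketch tying the quantities above together, assuming simple 1-D data; all function names here are illustrative, not from any library:

```python
def residual_sum_of_squares(y, y_hat):
    """RSS = SSR = SSE: sum of squared residuals (y_i - y_hat_i)."""
    return sum((yi - fi) ** 2 for yi, fi in zip(y, y_hat))

def sample_variance(xs):
    """Variance = sum of squared deviations from the mean, divided by dof
    (sample dof = n - 1 here)."""
    n = len(xs)
    mean = sum(xs) / n
    ss = sum((x - mean) ** 2 for x in xs)
    return ss / (n - 1)

def mse(y, y_hat):
    """Mean square error: RSS divided by the number of points."""
    return residual_sum_of_squares(y, y_hat) / len(y)

def least_squares_line(x, y):
    """Ordinary least squares fit y ≈ a + b*x: the (a, b) returned is
    exactly the solution minimizing residual_sum_of_squares."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

x = [0, 1, 2, 3]
y = [1, 3, 5, 7]  # lies exactly on y = 1 + 2x
a, b = least_squares_line(x, y)
print(a, b)                               # 1.0 2.0
print(mse(y, [a + b * xi for xi in x]))   # 0.0 (perfect fit)
```

(On the CHK question above: minimizing the sum of squares is ordinary least squares; other criteria do exist, e.g. minimizing the sum of absolute differences.)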