
Akaike information criterion (AIC)

From Wikipedia, the free encyclopedia

The Akaike information criterion is a measure of the relative goodness of fit of a statistical model.


It was developed by Hirotsugu Akaike under the name "an information criterion" (AIC), and was first published by Akaike in 1974.[1]


It is grounded in the concept of information entropy, in effect offering a relative measure of the information lost when a given model is used to describe reality.


It describes the trade-off between bias and variance in model construction, or, loosely speaking, between the accuracy and the complexity of the model.


AIC values provide a means for model selection.


AIC does not provide a test of a model in the sense of testing a null hypothesis; that is, AIC says nothing about how well a model fits the data in an absolute sense.


If all the candidate models fit poorly, AIC will not give any warning of that.
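The relative nature of AIC can be illustrated with a small sketch. For a least-squares fit with Gaussian errors, a common simplification of the criterion is AIC = n·ln(RSS/n) + 2k, where n is the number of observations, RSS the residual sum of squares, and k the number of fitted parameters. The model names, RSS values, and parameter counts below are hypothetical, chosen only to show how the complexity penalty 2k enters the comparison:

```python
import math

def aic(rss, n, k):
    """AIC for a least-squares fit with Gaussian errors.

    Uses the common simplification AIC = n * ln(RSS / n) + 2k,
    where k is the number of fitted parameters. Only differences
    between AIC values are meaningful, not the absolute numbers.
    """
    return n * math.log(rss / n) + 2 * k

# Hypothetical comparison: a simple 1-parameter model versus a more
# flexible 3-parameter model fitted to the same n = 50 observations.
# The RSS values are made up for illustration.
n = 50
candidates = {
    "simple model (k=1)":   aic(rss=120.0, n=n, k=1),
    "flexible model (k=3)": aic(rss=95.0,  n=n, k=3),
}

for name, score in candidates.items():
    print(f"{name}: AIC = {score:.2f}")

# The model with the lowest AIC is preferred among the candidates --
# but if every candidate fits poorly, the "best" one is still poor.
best = min(candidates, key=candidates.get)
print("preferred model:", best)
```

Note that the comparison only ranks the candidates relative to one another, which is exactly the caveat above: AIC gives no warning when all candidate models fit poorly.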


http://en.wikipedia.org/wiki/Akaike_information_criterion
