:: Volume 4, Issue 2 (3-2011) ::
J. of Stat. Sci. 2011, 4(2): 193-209
Improved Kullback-Leibler Upper Bound Based on Convex Combination of k Rival Models
Abdolreza Sayyareh
Abstract:
In this paper we establish that, for the Kullback-Leibler divergence, the relative error is superadditive. It follows that a convex mixture of k rival models yields an improved upper bound on the Kullback-Leibler divergence used in model selection. In particular, the mixed model is either better than all of the rival models in the mixture, or at least better than the worst rival model in the mixture.
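One way to see why the mixture cannot do worse than the worst rival is that, for a fixed true density f, the Kullback-Leibler divergence KL(f || .) is convex in its second argument, so the divergence to a convex mixture is bounded by the weighted average of the divergences to the individual rivals (a standard fact stated here for illustration; the paper's own argument proceeds via superadditivity of the relative error). Below is a minimal numerical sketch of this inequality for discrete distributions; the distributions, weights, and variable names are illustrative assumptions, not taken from the paper.

```python
# Sketch: KL(f || sum_i w_i g_i) <= sum_i w_i KL(f || g_i) <= max_i KL(f || g_i).
# All distributions and weights here are made-up illustrations.
import numpy as np
from scipy.stats import entropy  # entropy(p, q) returns KL(p || q)

f = np.array([0.5, 0.3, 0.2])            # "true" discrete model
rivals = [np.array([0.6, 0.2, 0.2]),     # rival models g_1, g_2, g_3
          np.array([0.3, 0.4, 0.3]),
          np.array([0.2, 0.2, 0.6])]
w = np.array([0.5, 0.3, 0.2])            # convex weights (nonnegative, sum to 1)

mixture = sum(wi * gi for wi, gi in zip(w, rivals))

kl_rivals = np.array([entropy(f, g) for g in rivals])
kl_mix = entropy(f, mixture)

print("KL(f || g_i):     ", np.round(kl_rivals, 4))
print("KL(f || mixture): ", round(kl_mix, 4))

# By convexity of KL in its second argument, the mixture's divergence
# cannot exceed the weighted average, hence cannot exceed the worst rival.
assert kl_mix <= w @ kl_rivals <= kl_rivals.max()
```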
Keywords: Convex Combination, Geometric Mean, Kullback-Leibler Risk, Mixture of Models, Model Selection, Relative Error
Full-Text [PDF 1376 kb]
Type of Study: Research | Subject: Statistical Inference
Received: 2012/04/19 | Accepted: 2012/04/20
Citation:
Sayyareh A. Improved Kullback-Leibler Upper Bound Based on Convex Combination of k Rival Models. J. of Stat. Sci. 2011; 4(2): 193-209.
URL: http://jss.irstat.ir/article-1-115-en.html


Journal of Statistical Sciences – Scientific Research Journal of the Iranian Statistical Association