AU - Sayyareh, Abdolreza
TI - Improved Kullback-Leibler Upper Bound Based on Convex Combination of k Rival Models
PT - JOURNAL ARTICLE
TA - JSS
JN - JSS
VO - 4
VI - 2
IP - 2
4099 - http://jss.irstat.ir/article-1-115-en.html
4100 - http://jss.irstat.ir/article-1-115-en.pdf
SO - JSS 2
AB - In this paper we establish that, for the Kullback-Leibler divergence, the relative error is superadditive. This shows that a mixture of k rival models gives a better upper bound on the Kullback-Leibler divergence for model selection. In fact, the mixed model is shown to yield either a model better than all of the rival models in the mixture, or a model better than the worst rival model in the mixture.
CP - IRAN
IN - Kermanshah
LG - eng
PB - JSS
PG - 193
PT - Research
YR - 2011