New Developments in Statistical Information Theory Based on Entropy and Divergence Measures - Leandro Pardo - Book - Mdpi AG - 9783038979364 - May 20, 2019

New Developments in Statistical Information Theory Based on Entropy and Divergence Measures

Price
¥10,767 (excluding tax)

This book presents new and original research in Statistical Information Theory, based on minimum divergence estimators and test statistics, from both a theoretical and an applied point of view, covering different statistical problems with special emphasis on efficiency and robustness. Divergence statistics based on maximum likelihood estimators, as well as Wald statistics, likelihood ratio statistics, and Rao's score statistics, share several optimal asymptotic properties, but they are highly non-robust under model misspecification or in the presence of outlying observations. It is well known that a small deviation from the underlying model assumptions can have a drastic effect on the performance of these classical tests. In particular, this book presents a robust version of the classical Wald test, for simple and composite null hypotheses in general parametric models, based on minimum divergence estimators.
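As a minimal illustration of the kind of construction described above (a sketch, not reproduced from the book, using the density power divergence of Basu et al. purely as one example of a divergence measure; the symbols g, f_theta, beta, theta_0, and Sigma_beta are illustrative assumptions), a minimum divergence estimator and a Wald-type statistic built from it can be written in LaTeX as:

% Density power divergence between a true density g and a model density f_theta,
% with tuning parameter beta > 0:
d_\beta(g, f_\theta) = \int \Big\{ f_\theta^{1+\beta}(x) - \Big(1 + \tfrac{1}{\beta}\Big) f_\theta^{\beta}(x)\, g(x) + \tfrac{1}{\beta}\, g^{1+\beta}(x) \Big\}\, dx

% Minimum density power divergence estimator from a sample X_1, ..., X_n
% (the term involving g alone does not depend on theta and is dropped):
\hat{\theta}_\beta = \arg\min_{\theta} \left[ \int f_\theta^{1+\beta}(x)\, dx - \Big(1 + \tfrac{1}{\beta}\Big) \frac{1}{n} \sum_{i=1}^{n} f_\theta^{\beta}(X_i) \right]

% Wald-type test statistic for H_0: theta = theta_0, where Sigma_beta(theta_0)
% denotes the asymptotic covariance matrix of the estimator:
W_n = n\, \big(\hat{\theta}_\beta - \theta_0\big)^{\top} \Sigma_\beta^{-1}(\theta_0)\, \big(\hat{\theta}_\beta - \theta_0\big)

For small beta such an estimator behaves much like maximum likelihood, while larger beta trades some efficiency for robustness against outlying observations, which is the efficiency/robustness trade-off the description refers to.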

Format: Book, Paperback (softcover with glued spine)
Release date: May 20, 2019
ISBN-13: 9783038979364
Publisher: Mdpi AG
Pages: 344
Dimensions: 170 × 244 × 24 mm · 734 g
Language: English

