A Dimensionality Reduction Method for Finding Least Favorable Priors with a Focus on Bregman Divergence

Publication date
2022
Authors
Goldenbaum, Mario  
Dytso, Alex  
Poor, H. Vincent  
Shamai Shitz, Shlomo  
Abstract
A common way of characterizing minimax estimators in point estimation is by moving the problem into the Bayesian estimation domain and finding a least favorable prior distribution. Under mild conditions, the Bayesian estimator induced by a least favorable prior is then known to be minimax. However, finding least favorable distributions can be challenging due to the inherent optimization over the space of probability distributions, which is infinite-dimensional. This paper develops a dimensionality reduction method that allows us to move the optimization to a finite-dimensional setting with an explicit bound on the dimension. The benefit of this dimensionality reduction is that it permits the use of popular algorithms such as projected gradient ascent to find least favorable priors. Throughout the paper, in order to make progress on the problem, we restrict ourselves to Bayesian risks induced by a relatively large class of loss functions, namely Bregman divergences.
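The following is a minimal, self-contained sketch of the kind of finite-dimensional optimization the abstract alludes to; it is not the authors' implementation and does not reflect the paper's dimension bound. The prior is restricted to a fixed finite grid of support points, the loss is the squared error (one member of the Bregman divergence family), the observation model is a Gaussian-noise channel with a bounded mean, and projected gradient ascent on the probability simplex is used to maximize the Bayes risk of the Bayes estimator (here, the posterior mean). The parameter bound A, the grid sizes, the step size, and the iteration count are all illustrative assumptions.

    # Sketch: projected gradient ascent over discrete priors to approximate a
    # least favorable prior for a bounded normal mean under squared-error loss.
    # All numerical choices below are assumptions made for this demo.
    import numpy as np

    A = 2.0                                   # assumed bound: theta in [-A, A]
    thetas = np.linspace(-A, A, 41)           # finite support grid for the prior
    xs = np.linspace(-A - 5.0, A + 5.0, 401)  # quadrature grid for the observation x
    dx = xs[1] - xs[0]
    # lik[i, j] = density of x_j given theta_i under the N(theta_i, 1) model
    lik = np.exp(-0.5 * (xs[None, :] - thetas[:, None]) ** 2) / np.sqrt(2.0 * np.pi)

    def bayes_risk(p):
        """Bayes risk of the Bayes estimator (posterior mean) under prior weights p."""
        marg = p @ lik                                        # marginal density of x
        post_mean = ((p * thetas) @ lik) / np.maximum(marg, 1e-300)
        cond_risk = np.sum((post_mean[None, :] - thetas[:, None]) ** 2 * lik, axis=1) * dx
        return p @ cond_risk

    def project_simplex(v):
        """Euclidean projection of v onto the probability simplex."""
        u = np.sort(v)[::-1]
        css = np.cumsum(u)
        idx = np.arange(1, len(v) + 1)
        rho = np.nonzero(u + (1.0 - css) / idx > 0)[0][-1]
        tau = (1.0 - css[rho]) / (rho + 1)
        return np.maximum(v + tau, 0.0)

    def grad_fd(p, eps=1e-6):
        """Finite-difference gradient of the Bayes risk with respect to p."""
        base = bayes_risk(p)
        g = np.empty_like(p)
        for i in range(len(p)):
            q = p.copy()
            q[i] += eps
            g[i] = (bayes_risk(q) - base) / eps
        return g

    p = np.full(len(thetas), 1.0 / len(thetas))    # start from the uniform prior
    for _ in range(300):
        p = project_simplex(p + 0.2 * grad_fd(p))  # ascent step, then project

    print("Bayes risk of candidate least favorable prior:", bayes_risk(p))
    print("support points with mass > 1e-3:", thetas[p > 1e-3])
    print("corresponding masses:", p[p > 1e-3])

Because the Bayes risk is a concave function of the prior, projected gradient ascent of this kind converges for the discretized problem; under the assumptions above, the mass tends to concentrate on only a few grid points, which is the qualitative behavior expected of least favorable priors for a bounded normal mean. The paper's contribution, as stated in the abstract, is an explicit bound on the dimension needed for such a finite-dimensional search.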
Keywords
Bregman Divergence
Publisher
PMLR
Institution
Hochschule Bremen  
Department
Hochschule Bremen - Fakultät 4: Elektrotechnik und Informatik  
Document type
Article
Journal / Collection
Proceedings of Machine Learning Research  
Volume
151
First page
8080
Last page
8094
Language
English
