Bernardi, E., Lanconelli, A., Lauria, C.S.A. (2025). Tracking Time Varying Parameters Via Online Simplified Maximum Likelihood. Journal of Optimization Theory and Applications, 206(1), 1-24. doi:10.1007/s10957-025-02716-2.
Tracking Time Varying Parameters Via Online Simplified Maximum Likelihood
Bernardi, Enrico (first author; Investigation); Lanconelli, Alberto (second author; Investigation); Lauria, Christopher S. A. (last author; Investigation)
2025
Abstract
Log-likelihood functions usually fail to satisfy the classical assumptions of strong convexity and Lipschitz continuity of the gradient (as well as many of their mild counterparts) that are common in general convergence results for stochastic gradient descent algorithms. Therefore, the use of gradient descent schemes to track the maxima of a sequence of objective log-likelihood functions suffers from a lack of theoretical results guaranteeing the validity of the method. In this paper, we propose a simplified online scheme to track unknown dynamic parameters that are the optima of a sequence of objective log-likelihood functions. Under a Lipschitz assumption on the time-varying optimum, we demonstrate that our estimator achieves mean square convergence up to a neighborhood of the optimum, and we establish that the Lipschitz continuity assumption is necessary when a specific desirable property is imposed. The method is inspired by a Taylor expansion of the log-likelihood function around the maximum likelihood estimator, and rigorously justified by the expression for the Riemannian gradient of the log-likelihood of a multivariate Gaussian distribution.
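To make the setting concrete, the sketch below is a toy illustration (not the authors' simplified maximum-likelihood scheme) of online tracking of a slowly drifting Gaussian mean with a constant-gain update. It relies on the well-known fact that, for a Gaussian mean under the Fisher metric, the Riemannian (natural) gradient of an observation's log-likelihood is simply x - theta; the scalar model, the sinusoidal drift and the gain gamma are assumptions chosen only for this demonstration.

```python
import numpy as np

# Purely illustrative sketch (not the paper's algorithm): online tracking of a
# slowly drifting Gaussian mean with a constant-gain update driven by the
# newest observation's log-likelihood. The scalar Gaussian model, the
# sinusoidal drift and the gain gamma are assumptions made for this demo only.

rng = np.random.default_rng(0)

T = 2000        # number of observations
sigma = 1.0     # known noise standard deviation (assumed)
gamma = 0.05    # constant gain / step size (assumed)

theta_hat = 0.0
sq_errors = []

for t in range(T):
    # Slowly varying true parameter (Lipschitz in t by construction).
    theta_star = np.sin(2.0 * np.pi * t / T)
    # New observation from the time-varying model.
    x_t = theta_star + sigma * rng.standard_normal()
    # For a Gaussian mean, the Fisher-metric (natural) gradient of the
    # log-likelihood is x_t - theta, so the update is a constant-gain
    # correction towards the new data point.
    theta_hat = theta_hat + gamma * (x_t - theta_hat)
    sq_errors.append((theta_hat - theta_star) ** 2)

# The squared tracking error settles in a neighborhood of zero whose size is
# governed by the gain, the noise level and the speed of the drift.
print("mean squared tracking error (second half):", np.mean(sq_errors[T // 2:]))
```

Consistent with the abstract's statement, the tracking error in such a scheme does not vanish but settles in a neighborhood whose size reflects the step size, the observation noise and the Lipschitz constant of the drift.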
| File | Size | Format |
|---|---|---|
| s10957-025-02716-2.pdf (open access; Type: Publisher's version / Version of Record; License: Creative Commons Attribution, CC BY) | 984.85 kB | Adobe PDF |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.


