Rényi Divergence to Compare Moving-Average Processes / Merchan, Fernando; Grivel, Eric; Diversi, Roberto. - ELECTRONIC. - (2018), pp. 149-153. (Paper presented at the 20th IEEE Statistical Signal Processing Workshop, SSP 2018, held in Freiburg, Germany, 2018) [10.1109/SSP.2018.8450711].
Rényi Divergence to Compare Moving-Average Processes
Diversi, Roberto
2018
Abstract
Comparing processes or models is of interest in various applications. Among the existing approaches, one of the most popular methods is the Kullback-Leibler (KL) divergence, which is related to Shannon's entropy. Similarly, the Rényi divergence of order α can be deduced from the Rényi entropy; when α tends to 1, it reduces to the KL divergence. In this paper, our purpose is to derive the expression of the Rényi divergence between the probability density functions of k consecutive samples of two real first-order moving-average (MA) processes by using the eigen-decompositions of their Toeplitz correlation matrices. The resulting expression is compared with the expressions of the Rao distance and the Jeffreys divergence (JD) based on the eigenvalues. The way these quantities evolve when k increases is then presented. When dealing with unit-zero MA processes, the derivative is infinite for the JD and finite for the others. The influence of α is also studied.
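The setting described in the abstract can be sketched numerically. The snippet below (an illustrative sketch, not the paper's closed-form derivation) builds the tridiagonal Toeplitz covariance matrix of k consecutive samples of a zero-mean Gaussian MA(1) process x_t = e_t + b·e_{t-1}, then evaluates the standard Gaussian closed form of the Rényi divergence of order α between two such processes. The function names, MA parameters, and noise variances are assumptions chosen for illustration.

```python
import numpy as np

def ma1_cov(b, sigma2, k):
    """Covariance (Toeplitz, tridiagonal) of k consecutive samples of the
    MA(1) process x_t = e_t + b*e_{t-1}, with e_t white of variance sigma2:
    r(0) = sigma2*(1 + b**2), r(1) = sigma2*b, r(tau) = 0 for tau >= 2."""
    r = np.zeros(k)
    r[0] = sigma2 * (1.0 + b ** 2)
    if k > 1:
        r[1] = sigma2 * b
    lags = np.abs(np.arange(k)[:, None] - np.arange(k)[None, :])
    return r[lags]

def renyi_div(S1, S2, alpha):
    """Renyi divergence of order alpha between N(0, S1) and N(0, S2),
    using the Gaussian closed form; requires (1-alpha)*S1 + alpha*S2 to be
    positive definite (always true here for alpha in (0, 1))."""
    Sa = (1.0 - alpha) * S1 + alpha * S2
    ld_a = np.linalg.slogdet(Sa)[1]
    ld_1 = np.linalg.slogdet(S1)[1]
    ld_2 = np.linalg.slogdet(S2)[1]
    return (ld_a - (1.0 - alpha) * ld_1 - alpha * ld_2) / (2.0 * (1.0 - alpha))

# Illustrative comparison of two MA(1) processes (parameter values assumed)
S1 = ma1_cov(b=0.5, sigma2=1.0, k=20)
S2 = ma1_cov(b=-0.3, sigma2=1.0, k=20)
d_half = renyi_div(S1, S2, alpha=0.5)
```

Taking α close to 1 recovers the KL divergence, as stated in the abstract. What makes analytical expressions tractable in this setting is that a symmetric tridiagonal Toeplitz matrix has closed-form eigenvalues, λ_j = r(0) + 2 r(1) cos(jπ/(k+1)) for j = 1, …, k, so the determinants above can be written as explicit products over j.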