Ortíz-Gómez, F. G., Martínez Rodríguez-Osorio, R., Salas-Natera, M. A., Landeros-Ayala, S., Tarchi, D., & Vanelli-Coralli, A. (2019). On the Use of Neural Networks for Flexible Payload Management in VHTS Systems.
On the Use of Neural Networks for Flexible Payload Management in VHTS Systems
Daniele Tarchi; Alessandro Vanelli-Coralli
2019
Abstract
Very High Throughput Satellites (VHTS) surpass the capacity of traditional systems providing FSS and BSS (fixed and broadcasting satellite services, respectively) by using multi-beam coverage. The objective of VHTS systems is to reach a satellite capacity of 1 Terabit/s in the near future. These systems offer greater satellite capacity at a reduced cost per Gbps in orbit, but further optimization is needed to exploit the full capacity of the satellite, as traffic demand is non-uniform and varies over time. In other words, VHTS systems require flexible payloads to meet changing traffic demands. This paper presents a solution for the automatic management of a flexible payload architecture using a Neural Network, treating resource allocation as a classification problem.
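The classification framing described in the abstract can be illustrated with a minimal sketch: a feed-forward network maps the per-beam traffic demand to one of a set of predefined payload configurations, and the most probable class is selected. All names, layer sizes, and the number of beams and configurations here are assumptions for illustration, not values from the paper, and the random weights stand in for a trained model.

```python
import numpy as np

rng = np.random.default_rng(0)

N_BEAMS = 8      # input: traffic demand (Gbps) per beam (assumed value)
N_CONFIGS = 4    # output: predefined payload configurations (assumed value)

# Randomly initialised weights stand in for a trained model.
W1 = rng.normal(0.0, 0.1, (N_BEAMS, 16))
b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.1, (16, N_CONFIGS))
b2 = np.zeros(N_CONFIGS)

def select_configuration(demand_gbps):
    """Classify a demand vector into a payload-configuration index."""
    h = np.tanh(demand_gbps @ W1 + b1)   # hidden layer
    logits = h @ W2 + b2
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()                 # softmax over configurations
    return int(np.argmax(probs)), probs

demand = np.array([1.2, 0.4, 2.5, 0.9, 1.7, 0.3, 2.1, 0.8])
config, probs = select_configuration(demand)
print(config, probs.round(3))
```

In this framing, flexibility amounts to picking, for each observed demand pattern, the configuration class whose resource split best matches the pattern; the network replaces a hand-crafted allocation rule.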