Paul Honeine, Entropies of Overcomplete Kernel Dictionaries, Bulletin of Mathematical Sciences and Applications (BMSA), Volume 16
https://www.scipress.com/BMSA.16.1
Abstract:
    In signal analysis and synthesis, linear approximation theory considers a linear decomposition of any given signal over a set of atoms, collected into a so-called dictionary. Relevant sparse representations are obtained by relaxing the orthogonality condition on the atoms, yielding overcomplete dictionaries with an extended number of atoms. Going beyond the linear decomposition, overcomplete kernel dictionaries provide an elegant nonlinear extension by defining the atoms through a kernel mapping function (e.g., the Gaussian kernel). Models based on such kernel dictionaries are used in neural networks, Gaussian processes, and online learning with kernels. The quality of an overcomplete dictionary is evaluated with a diversity measure, such as the distance, approximation, coherence, and Babel measures. In this paper, we develop a framework for examining overcomplete kernel dictionaries with the entropy from information theory. Indeed, a higher value of the entropy is associated with a more uniform spread of the atoms over the space. For each of the aforementioned diversity measures, we derive lower bounds on the entropy. Several definitions of the entropy are examined, with an extensive analysis in both the input space and the mapped feature space.
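    To make these quantities concrete, the following minimal Python sketch (the toy data, bandwidth sigma, and function names are illustrative assumptions, not taken from the paper) builds the Gram matrix of a Gaussian-kernel dictionary, computes its coherence as the largest off-diagonal kernel value, and forms a Parzen-window estimate of the quadratic Rényi entropy; a larger entropy value corresponds to atoms spread more uniformly over the space.

```python
import numpy as np

def gaussian_gram(X, sigma=1.0):
    """Gram matrix of the Gaussian kernel k(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-np.maximum(d2, 0.0) / (2.0 * sigma ** 2))

def coherence(K):
    """Largest off-diagonal entry: the maximal pairwise similarity between atoms."""
    off = K - np.diag(np.diag(K))
    return np.abs(off).max()

def renyi2_entropy_estimate(K):
    """Parzen-window estimate of the quadratic Renyi entropy, -log((1/n^2) sum_ij K_ij).
    The Gaussian normalization constant is omitted here; it only shifts the
    estimate by an additive constant, which is irrelevant for comparisons."""
    return -np.log(K.mean())

# Toy dictionary: 50 atoms in R^3 (illustrative data, not from the paper).
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))
K = gaussian_gram(X, sigma=1.0)
print("coherence:", coherence(K))
print("Renyi-2 entropy estimate:", renyi2_entropy_estimate(K))
```

    With this sketch, spreading the atoms farther apart lowers the coherence and raises the entropy estimate, which is the qualitative relationship between the diversity measures and the entropy bounds studied in the paper.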
Keywords:
    Dictionary Learning, Generalized Rényi Entropy, Gram Matrix, Kernel-Based Methods, Machine Learning, Pattern Recognition, Shannon Entropy, Sparse Approximation, Tsallis Entropy