Monday, 15 November 2021

Periodic Spectral Ergodicity Accurately Predicts Deep Learning Generalisation

 Preamble 

Salvador Dalí, The Persistence of Memory (1931). Image: Wikipedia.

One of the new mathematical concepts arising from the study of deep learning is called periodic spectral ergodicity (PSE). Its cascading version (cPSE) propagates over the layers of a deep network and can be used as a complexity measure. cPSE can also predict generalisation ability. In this post, we review this interesting finding in a short and accessible manner.

How periodic spectral ergodicity cascades over layers

We have reviewed spectral ergodicity in a gentle fashion earlier, here. The only difference is that in real deep learning architectures, the lengths of the eigenvalue spectra, i.e., the number of bins in the histograms generated by the weight matrices, are not equal. To align them, we apply periodic boundary conditions: we extend the eigenvalues cyclically, up to the maximum spectrum length seen up to that layer. Here are the steps that give the intuition of how to compute cascading periodic spectral ergodicity (cPSE).
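As a toy illustration, the cyclic alignment can be sketched as follows; the helper name `cyclic_align` is hypothetical and only illustrates the idea, it is not Bristol's API:

```python
import numpy as np

def cyclic_align(spectrum, target_len):
    """Tile a shorter eigenvalue spectrum cyclically (periodic
    boundary conditions) until it reaches target_len entries."""
    reps = int(np.ceil(target_len / len(spectrum)))
    return np.tile(spectrum, reps)[:target_len]

# A length-3 spectrum aligned to length 7 wraps around cyclically.
aligned = cyclic_align(np.array([0.5, 1.0, 2.0]), 7)
print(aligned)  # [0.5 1.  2.  0.5 1.  2.  0.5]
```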

1. Compute the eigenvalue spectrum up to layer $i$ and align the smaller spectra with periodic boundary conditions, i.e., cyclically.

2. Compute the spectral ergodicity at layers $i$ and $i-1$.

3. Compute the cascading PSE at layer $i$ simply as a distance between $\Omega^{i}$ and $\Omega^{i-1}$, e.g., the KL divergence taken in both directions; recall the earlier tutorials.

If we repeat this up to the last layer, cPSE measures the complexity of the deep learning architecture in a depth-wise fashion, capturing both its structure and the effect of the learning algorithm.
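The steps above can be sketched in a toy form. Note the simplifying assumptions: the spectral-ergodicity distance is reduced here to a symmetrised KL divergence between successive layers' aligned singular-value densities, and the random matrices stand in for trained weights; this is an illustration of the recipe, not Bristol's implementation:

```python
import numpy as np

def spectral_density(eigs, bins=20, rng=(0.0, 14.0)):
    # Normalised histogram of the (aligned) spectrum; a small epsilon
    # keeps the KL divergence finite when a bin is empty.
    hist, _ = np.histogram(eigs, bins=bins, range=rng)
    p = hist.astype(float) + 1e-12
    return p / p.sum()

def symmetric_kl(p, q):
    # KL divergence "in two directions", as in step 3.
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

rng = np.random.default_rng(42)
# Toy "weight matrices" of unequal sizes, so spectra differ in length.
weights = [rng.normal(size=(n, n)) for n in (16, 32, 24, 40, 28)]
spectra = [np.linalg.svd(W, compute_uv=False) for W in weights]

max_len = 0
densities, pse = [], []
for i, s in enumerate(spectra):
    max_len = max(max_len, len(s))
    # Step 1: cyclic alignment up to the maximum length seen so far.
    reps = int(np.ceil(max_len / len(s)))
    aligned = np.tile(s, reps)[:max_len]
    # Step 2: spectral density for this layer.
    densities.append(spectral_density(aligned))
    # Step 3: distance between consecutive layers' densities.
    if i > 0:
        pse.append(symmetric_kl(densities[i - 1], densities[i]))

print(pse)  # one non-negative distance per consecutive layer pair
```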

 Generalisation Gap and cPSE

Apart from being a complexity measure, cPSE predicts the generalisation gap given a reference architecture, i.e., it correlates almost perfectly with performance. These findings are presented in the paper suzen2019.

Conclusions and Outlook

The complexity of deep learning architectures is still an open research problem. One of the most promising directions is cPSE, since it captures structural complexity as well. Other measures in the literature do not consider depth dependency; cPSE appears to be the first one that does.

Reference

@article{suzen2019,
  title={Periodic Spectral Ergodicity: A Complexity Measure for Deep Neural Networks and Neural Architecture Search},
  author={S{\"u}zen, Mehmet and Cerd{\`a}, Joan J and Weber, Cornelius},
  journal={arXiv preprint arXiv:1911.07831},
  year={2019}
}

Cite this post as: Mehmet Süzen, Periodic Spectral Ergodicity Accurately Predicts Deep Learning Generalisation, https://science-memo.blogspot.com/2021/11/periodic-spectral-ergodicity-predicts-generalisation-deep-learning.html, 2021.

Appendix 

Bristol v0.12.2 now supports computing cPSE from a list of matrices:

from bristol import cPSE
import numpy as np

np.random.seed(42)

# Ten random 64x64 matrices stand in for the layer weight matrices.
matrices = [np.random.normal(size=(64, 64)) for _ in range(10)]

# Returns the depth-wise values and the cPSE measure itself.
d_layers, cpse = cPSE.cpse_measure_vanilla(matrices)



(c) Copyright 2008-2024 Mehmet Suzen (suzen at acm dot org)

Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.