Poster in Workshop on Machine Learning and Compression
Prechastic Coding: An Alternative Approach to Neural Network Description Lengths
Paris Flood · Pietro Liò
The minimum description length (MDL) principle has a rich history of informing neural network research, and numerous algorithms exist for computing efficient neural network description lengths. Of these methods, prequential coding, based on the prequential approach to statistics, has proven highly successful. Despite its achievements, general prequential coding limits learning at each increment to a prefix of a given dataset, a constraint which is potentially misaligned with an effective learning process. In this paper we introduce prechastic coding, an alternative to the prequential approach that is based on a guided, noisy sequence of intermediate learning steps. In our experiments we find that prechastic coding can challenge prequential coding in certain scenarios, whilst also leaving significant potential for further improvement.
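To make the baseline concrete, below is a minimal sketch of prequential (plug-in) coding, the approach the paper contrasts with: the dataset is split into blocks, block t is encoded under a model fit only on the prefix of blocks 1..t-1, and the description length is the summed log-loss in bits. The block boundaries, the logistic-regression stand-in for a neural network, and the uniform code for the first block are illustrative assumptions, not details from this paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def prequential_code_length(X, y, n_blocks=8, n_classes=2):
    """Prequential description length of labels y given inputs X, in bits."""
    bounds = np.linspace(0, len(y), n_blocks + 1, dtype=int)
    total_bits = 0.0
    for t in range(n_blocks):
        lo, hi = bounds[t], bounds[t + 1]
        if t == 0 or len(np.unique(y[:lo])) < 2:
            # No usable model trained on the prefix yet: encode the block
            # with a uniform code over the label alphabet.
            total_bits += (hi - lo) * np.log2(n_classes)
            continue
        # Train only on the prefix, then pay -log2 p(label) for each new point.
        model = LogisticRegression().fit(X[:lo], y[:lo])
        probs = model.predict_proba(X[lo:hi])
        p_true = probs[np.arange(hi - lo), y[lo:hi]]
        total_bits += -np.log2(np.clip(p_true, 1e-12, None)).sum()
    return total_bits

# Toy usage: near-linearly-separable labels should compress well.
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 5))
y = (X[:, 0] + 0.1 * rng.normal(size=400) > 0).astype(int)
print(f"prequential code length: {prequential_code_length(X, y):.1f} bits")
```

Note how each increment is constrained to a prefix of the dataset; the prechastic alternative proposed here replaces that prefix schedule with a guided, noisy sequence of intermediate learning steps.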