This work demonstrates a formal connection between density estimation with a data-rate constraint and the joint objective of fixed-rate universal lossy source coding and model identification.
Shannon entropy estimation on countably infinite alphabets is then addressed by combining convergence results for the entropy functional with concentration inequalities.
Finally, a new histogram-based mutual information estimator built on data-driven tree-structured partitions (TSPs) is presented.
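To fix ideas, the core plug-in step of a histogram-based mutual information estimator can be sketched as follows. This is a minimal illustration on a fixed grid partition; the actual TSP scheme grows its cells adaptively from the data, which this sketch does not attempt to reproduce, and the function name and grid choice are assumptions for illustration only.

```python
import numpy as np

def plug_in_mi(x, y, x_edges, y_edges):
    """Plug-in (histogram) mutual information estimate, in nats,
    from empirical cell frequencies on a given 2-D partition.

    Note: the partition here is a fixed grid supplied by the caller;
    a tree-structured-partition method would instead choose the cells
    adaptively from the data (not shown)."""
    counts, _, _ = np.histogram2d(x, y, bins=[x_edges, y_edges])
    p_xy = counts / counts.sum()           # joint cell probabilities
    p_x = p_xy.sum(axis=1, keepdims=True)  # marginal over y (column vector)
    p_y = p_xy.sum(axis=0, keepdims=True)  # marginal over x (row vector)
    mask = p_xy > 0                        # sum only over occupied cells
    # I-hat = sum_ij p(i,j) * log( p(i,j) / (p(i) q(j)) )
    return float(np.sum(p_xy[mask] * np.log(p_xy[mask] / (p_x @ p_y)[mask])))
```

On strongly dependent samples the estimate is large, while on independent samples it is close to zero (up to the well-known positive bias of plug-in estimates on finite data).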