Shannon Bounds for Quadratic Rate-Distortion Problems

Submitted by admin on Wed, 10/23/2024 - 01:52

The Shannon lower bound has been the subject of several important contributions by Berger. This paper surveys Shannon bounds on rate-distortion problems under mean-squared error distortion with a particular emphasis on Berger’s techniques. Moreover, as a new result, the Gray-Wyner network is added to the canon of settings for which such bounds are known. In the Shannon bounding technique, elegant lower bounds are expressed in terms of the source entropy power.
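The Shannon lower bound mentioned above can be stated concretely for mean-squared error: with entropy power N(X) = e^{2h(X)}/(2πe), it reads R(D) ≥ ½ log(N(X)/D). A minimal numerical sketch (the Gaussian test case and variance are illustrative, not from the paper):

```python
import math

def entropy_power(h):
    """Entropy power N(X) = exp(2*h(X)) / (2*pi*e), with h in nats."""
    return math.exp(2 * h) / (2 * math.pi * math.e)

def shannon_lower_bound(h, D):
    """Shannon lower bound R(D) >= 1/2 * log(N(X)/D) in nats, for MSE distortion D."""
    return max(0.0, 0.5 * math.log(entropy_power(h) / D))

# For a Gaussian source with variance s2, h = 1/2 * log(2*pi*e*s2), so N(X) = s2
# and the bound coincides with the true rate-distortion function 1/2 * log(s2/D).
s2 = 1.0
h_gauss = 0.5 * math.log(2 * math.pi * math.e * s2)
print(shannon_lower_bound(h_gauss, 0.25))  # 0.5 * log(4) ≈ 0.693 nats
```

The Gaussian case is the one source for which the bound is tight at all distortion levels, which is why these surveys express results in terms of the source's entropy power.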

Dynamic Group Testing to Control and Monitor Disease Progression in a Population

Proactive testing and interventions are crucial for disease containment during a pandemic until widespread vaccination is achieved. However, a key challenge remains: Can we accurately identify all new daily infections using only a fraction of the tests needed to test everyone, every day?
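The abstract's question of testing fewer than everyone is the classical group-testing trade-off. As a hedged illustration (not the paper's dynamic algorithm), two-stage Dorfman pooling already shows the savings at low prevalence; the population size, prevalence, and pool size below are made-up parameters:

```python
import random

def dorfman_tests(status, pool_size):
    """Count tests used by two-stage Dorfman pooling:
    test each pool once; retest every member of a positive pool individually."""
    tests = 0
    for i in range(0, len(status), pool_size):
        pool = status[i:i + pool_size]
        tests += 1              # one pooled test
        if any(pool):           # positive pool -> individual retests
            tests += len(pool)
    return tests

random.seed(0)
population = [random.random() < 0.02 for _ in range(1000)]  # 2% prevalence (illustrative)
print(dorfman_tests(population, pool_size=10))  # far fewer than 1000 individual tests
```

Dynamic schemes go further by exploiting side information about disease progression over time, but the pooling principle is the same.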

Tightening Continuity Bounds for Entropies and Bounds on Quantum Capacities

Uniform continuity bounds on entropies are generally expressed in terms of a single distance measure between probability distributions or quantum states, typically the total variation or trace distance. However, if an additional distance measure is known, the continuity bounds can be significantly strengthened. Here, we prove a tight uniform continuity bound for the Shannon entropy in terms of both the local and total variation distances, sharpening an inequality in I. Sason, IEEE Trans. Inf. Th., 59, 7118 (2013).
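For context, the single-distance bound being sharpened is the total-variation continuity bound |H(P) − H(Q)| ≤ ε·log(|X| − 1) + h_b(ε), where ε is the total variation distance and h_b the binary entropy. A quick numerical check (the two distributions are illustrative):

```python
import math

def H(p):
    """Shannon entropy in bits of a probability vector."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def hb(e):
    """Binary entropy function h_b(e) in bits."""
    return H([e, 1 - e]) if 0 < e < 1 else 0.0

P = [0.5, 0.3, 0.2]
Q = [0.4, 0.4, 0.2]
eps = 0.5 * sum(abs(a - b) for a, b in zip(P, Q))  # total variation distance
lhs = abs(H(P) - H(Q))
rhs = eps * math.log2(len(P) - 1) + hb(eps)
print(lhs <= rhs)  # True
```

The paper's contribution is that knowing a second distance (the local distance) on top of ε tightens this right-hand side further.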

Statistical Inference With Limited Memory: A Survey

The problem of statistical inference in its various forms has been the subject of extensive research for decades. Most of the effort has been focused on characterizing the behavior as a function of the number of available samples, with far less attention given to the effect of memory limitations on performance. Recently, this latter topic has drawn much interest in the engineering and computer science literature.

An Information-Theoretic Approach to Unsupervised Feature Selection for High-Dimensional Data

In this paper, we propose an information-theoretic approach to designing functional representations that extract the hidden common structure shared by a set of random variables. The main idea is to measure the common information between the random variables by Watanabe's total correlation, and then find the hidden attributes of these random variables such that, given these attributes, the common information is reduced the most.
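Watanabe's total correlation for discrete variables is C(X₁,…,Xₙ) = Σᵢ H(Xᵢ) − H(X₁,…,Xₙ), the gap between the sum of marginal entropies and the joint entropy. A minimal sketch for two binary variables (the joint distribution is illustrative):

```python
import math
from collections import Counter

def entropy(p):
    """Shannon entropy in bits of an iterable of probabilities."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# Illustrative joint pmf over (X1, X2): positively correlated bits.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions.
p1, p2 = Counter(), Counter()
for (a, b), p in joint.items():
    p1[a] += p
    p2[b] += p

# Total correlation: sum of marginal entropies minus joint entropy.
tc = entropy(p1.values()) + entropy(p2.values()) - entropy(joint.values())
print(tc)  # ≈ 0.278 bits; it would be 0 for independent variables
```

The approach in the paper then searches for attributes that, once conditioned on, shrink this quantity as much as possible.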

Energy-Reliability Limits in Nanoscale Feedforward Neural Networks and Formulas

Due to energy-efficiency requirements, computational systems are now being implemented using noisy nanoscale semiconductor devices whose reliability depends on the energy consumed. We study circuit-level energy-reliability limits for deep feedforward neural networks (multilayer perceptrons) built using such devices, and en route also establish the same limits for formulas (Boolean tree-structured circuits).

PacGAN: The Power of Two Samples in Generative Adversarial Networks

Generative adversarial networks (GANs) are innovative techniques for learning generative models of complex data distributions from samples. Despite remarkable improvements in generating realistic images, one of their major shortcomings is the fact that in practice, they tend to produce samples with little diversity, even when trained on diverse datasets. This phenomenon, known as mode collapse, has been the main focus of several recent advances in GANs.
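PacGAN's core idea for mitigating mode collapse is "packing": the discriminator judges m samples jointly (concatenated into one input) rather than one at a time, so a generator that keeps emitting near-identical samples is easy to catch. A minimal sketch of the packing step, with illustrative batch and feature sizes (not the paper's architecture):

```python
import random

def pack(samples, m):
    """Pack groups of m samples into single discriminator inputs by
    concatenating their feature vectors (the PacGAN packing idea)."""
    random.shuffle(samples)
    return [sum(samples[i:i + m], []) for i in range(0, len(samples) - m + 1, m)]

random.seed(0)
batch = [[random.gauss(0, 1) for _ in range(4)] for _ in range(8)]  # 8 samples, dim 4
packed = pack(batch, m=2)
print(len(packed), len(packed[0]))  # 4 packed inputs of dim 8
```

In training, both real and generated batches are packed the same way before being fed to the discriminator; only the discriminator's input layer changes, which is what makes the technique cheap to adopt.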