Special Issue in Entropy: "Information Decomposition of Target Effects from Multi-Source Interactions"
Submission Deadline: May 31, 2017 (open for submission now!)
Shannon information theory has provided rigorous ways to capture our intuitive notions regarding uncertainty and information, and has made an enormous impact in doing so. One of its fundamental measures is mutual information, which captures the average information contained in one variable about another, and vice versa. If we have two source variables and a target, for example, we can measure the information held by one source about the target, the information held by the other source about the target, and the information held by the two sources together about the target. Any other notion of the directed information relationship between these variables that can be captured by classical information-theoretic measures (e.g., conditional mutual information terms) is linearly redundant with those three quantities.
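To make these three classical quantities concrete, here is a minimal Python sketch (an illustration only, not part of the formal call), assuming a toy joint distribution in which two independent, uniform binary sources determine the target via Y = X1 XOR X2; the function names and the example distribution are our own hypothetical choices.

```python
# Minimal sketch (illustrative only): the three classical quantities for two
# sources X1, X2 and a target Y, for a hypothetical joint distribution.
from collections import defaultdict
from math import log2

def mutual_information(p_xy):
    """I(X;Y) in bits from a joint distribution given as {(x, y): probability}."""
    p_x, p_y = defaultdict(float), defaultdict(float)
    for (x, y), p in p_xy.items():
        p_x[x] += p
        p_y[y] += p
    return sum(p * log2(p / (p_x[x] * p_y[y]))
               for (x, y), p in p_xy.items() if p > 0)

def marginal(p_joint, keep):
    """Marginalise a joint distribution {(x1, x2, y): p} onto two key positions."""
    out = defaultdict(float)
    for key, p in p_joint.items():
        out[(key[keep[0]], key[keep[1]])] += p
    return dict(out)

# Hypothetical example: X1, X2 uniform and independent, Y = X1 XOR X2.
p_x1x2y = {(x1, x2, x1 ^ x2): 0.25 for x1 in (0, 1) for x2 in (0, 1)}

i_x1_y   = mutual_information(marginal(p_x1x2y, (0, 2)))                  # I(X1; Y)
i_x2_y   = mutual_information(marginal(p_x1x2y, (1, 2)))                  # I(X2; Y)
i_x1x2_y = mutual_information({((x1, x2), y): p
                               for (x1, x2, y), p in p_x1x2y.items()})    # I(X1,X2; Y)

# Any further classical quantity is a linear combination of these three,
# e.g. the conditional mutual information via the chain rule:
i_x1_y_given_x2 = i_x1x2_y - i_x2_y                                       # I(X1; Y | X2)

print(i_x1_y, i_x2_y, i_x1x2_y, i_x1_y_given_x2)   # -> 0.0 0.0 1.0 1.0
```

The last line illustrates the linear redundancy mentioned above: the conditional mutual information I(X1; Y | X2) follows directly from the three quantities via the chain rule.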
However, intuitively, there is a strong desire to measure further notions of how this directed information interaction may be decomposed: for example, how much information the two source variables hold redundantly about the target, how much each source variable holds uniquely, and how much information can only be discerned by examining the two sources together synergistically. These notions go beyond the traditional information-theoretic view of a channel serving the purpose of reliable communication, considering instead the situation of multiple communication streams converging on a single target. This is a common situation in biology, and in particular in neuroscience, where, say, the ability of a target to synergistically fuse multiple information sources in a non-trivial fashion is likely to have its own intrinsic value, independently of the reliability of communication.
The absence of measures for such decompositions into redundant, unique, and synergistic information is arguably the most fundamental missing piece in classical information theory. Triggered by the formulation of the Partial Information Decomposition framework by Williams and Beer in 2010, the past few years have witnessed a concentration of work by the community on proposing, contrasting, and investigating new measures to capture these notions of information decomposition. Other theoretical developments consider how these measures relate to concepts of information processing in terms of storage, transfer, and modification. Meanwhile, computational neuroscience has emerged as a primary application area, due both to significant interest in questions surrounding how target neurons integrate information from large numbers of sources and to the availability of data sets on which to investigate these questions.
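As a concrete illustration of such a decomposition, the sketch below implements, as we understand it, the original Imin redundancy measure from the Williams-Beer framework for two discrete sources, and applies it to hypothetical XOR and AND gate distributions. The helper names (pid_two_sources, specific_info) and the choice of examples are ours; this is a sketch of one candidate measure, not a prescribed implementation, and other proposed measures in the literature can assign different values.

```python
# Sketch (illustrative only) of the Williams-Beer two-source decomposition of
# I(X1,X2;Y) into redundancy (their Imin measure), unique information of each
# source, and synergy. Distributions are given as {(x1, x2, y): probability}.
from collections import defaultdict
from math import log2

def pid_two_sources(p_x1x2y):
    p_y, p_x1, p_x2 = defaultdict(float), defaultdict(float), defaultdict(float)
    p_x1y, p_x2y = defaultdict(float), defaultdict(float)
    for (x1, x2, y), p in p_x1x2y.items():
        p_y[y] += p
        p_x1[x1] += p
        p_x2[x2] += p
        p_x1y[(x1, y)] += p
        p_x2y[(x2, y)] += p

    def specific_info(y, p_xy, p_x):
        # I(Y=y; X): information a single source X provides about the outcome y.
        return sum((p / p_y[y]) * log2((p / p_x[x]) / p_y[y])
                   for (x, yy), p in p_xy.items() if yy == y and p > 0)

    def mi(p_xy, p_x):
        # Classical mutual information I(X; Y) in bits.
        return sum(p * log2(p / (p_x[x] * p_y[y]))
                   for (x, y), p in p_xy.items() if p > 0)

    # Imin: expected (over y) minimum specific information across the sources.
    redundancy = sum(p_y[y] * min(specific_info(y, p_x1y, p_x1),
                                  specific_info(y, p_x2y, p_x2))
                     for y in p_y)

    # Joint mutual information I(X1,X2; Y), treating (x1, x2) as one variable.
    p_x12y = {((x1, x2), y): p for (x1, x2, y), p in p_x1x2y.items()}
    p_x12 = defaultdict(float)
    for ((x1, x2), y), p in p_x12y.items():
        p_x12[(x1, x2)] += p

    unique1 = mi(p_x1y, p_x1) - redundancy
    unique2 = mi(p_x2y, p_x2) - redundancy
    synergy = mi(p_x12y, p_x12) - redundancy - unique1 - unique2
    return {"redundancy": redundancy, "unique_X1": unique1,
            "unique_X2": unique2, "synergy": synergy}

# Hypothetical examples: under Imin, XOR is purely synergistic (1 bit), while
# AND mixes redundancy (about 0.311 bits) with synergy (0.5 bits).
xor_gate = {(a, b, a ^ b): 0.25 for a in (0, 1) for b in (0, 1)}
and_gate = {(a, b, a & b): 0.25 for a in (0, 1) for b in (0, 1)}
print(pid_two_sources(xor_gate))
print(pid_two_sources(and_gate))
```

Such worked examples also show why the topic remains open: different candidate measures can disagree on how the decomposition should be assigned, which is part of what this Special Issue examines.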
This Special Issue seeks to bring together these efforts, to capture a snapshot of the current research, as well as to provide impetus for, and focused scrutiny on, newer work. We also seek to present progress to the wider community and to attract further research. We welcome research articles proposing new measures or pointing out future directions, review articles on existing approaches, commentary on the properties and limitations of such approaches, philosophical contributions on how such measures may be used or interpreted, applications to empirical data (e.g., neural imaging data), and more.
Submission Information
Please see the special issue website for full details.
Manuscripts can be submitted until the deadline. Papers will be published continuously (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, as well as communications are invited. Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers).
For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on the website.
Special Issue Editors:
- Dr. Joseph Lizier; Centre for Complex Systems, Faculty of Engineering and IT, The University of Sydney, Australia
- Dr. Nils Bertschinger; Frankfurt Institute for Advanced Studies (FIAS), Frankfurt, Germany
- Prof. Juergen Jost; Max Planck Institute for Mathematics in the Sciences, Leipzig, Germany, and Santa Fe Institute, NM, USA
- Prof. Michael Wibral; MEG Unit, Brain Imaging Center, Goethe University, Frankfurt, Germany