We would like to bring to your attention an Entropy Special Issue on "Information Flow in Neural Systems," guest edited by Pulkit Grover, Chris Quinn, and Gabe Schamberg. The issue will focus on information-theoretic notions of information flow in both natural and artificial neural networks.
Further instructions for submission and a more detailed description of the Special Issue can be found at https://www.mdpi.com/journal/entropy/special_issues/neural_system and are also included below.
You are welcome to pass this announcement on to anyone who may be interested. Please feel free to reach out with any questions, and we will be happy to help.
Pulkit, Chris, and Gabe
---------------------------
Dear Colleagues,
In an exciting confluence, information flows in neural networks, both artificial and natural, are garnering immense interest. While the term "information flow" is used frequently in practical contexts, such as clinical neuroscience or the optimization and interpretation of artificial neural networks, fundamental exploration of the topic has received limited attention. The societal implications of defining, understanding, designing, and/or affecting information flows are deep and broad, influencing all aspects of our lives. Many of these issues require careful and rigorous approaches that have only recently begun to be developed.
This Special Issue focuses on core information-theoretic issues pertaining to flows of information in natural and artificial neural networks. Information theory here is to be interpreted broadly, including, for instance, classical (Shannon) information theory, algorithmic information theory, control theory, and integrated information theory. The issue is intended to have a balanced representation of the natural and artificial worlds, and papers connecting the two, or critiquing the perceived connection between them, are also of interest. The Special Issue solicits papers that are, in their essence, intellectual and/or theoretical, although demonstrations on real or synthetic datasets are encouraged where possible.
This Special Issue will assimilate the current approaches to the following (and related) topics:
- Axiomatic definitions, measures, and/or estimators of information flow in neural systems;
- Models for controlling information flow;
- Scalability of information flow methods in high-dimensional neural systems;
- Analysis and extensions of established information flow measures;
- Models of information flow in clinical neuroscientific settings;
- Connections between information flow and causality;
- Measurement of information flow in multimodal neural datasets;
- Relationships between artificial and natural neural systems.
Prof. Pulkit Grover
Prof. Christopher Quinn
Dr. Gabriel Schamberg
Guest Editors
Manuscript Submission Information
Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, you can proceed to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.
Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for the submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.
Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.
Keywords
- Information flow
- Neuroscience
- Artificial neural networks
- Information theory
- Control and dynamical systems
- Integrated information