Workshop
Information-Theoretic Principles in Cognitive Systems (InfoCog)
Noga Zaslavsky · Rava Azeredo da Silveira · Ronit Bustin · Ron M. Hecht
Room 215 - 216
Fri 15 Dec, 6:15 a.m. PST
Information theory provides a mathematical framework for formulating and quantifying the basic limits of data compression and communication. Although the notions of compression and communication are rooted in analog and digital communication, they are relevant to other domains as well, and information theory therefore spans a number of research fields. The aim of formulating, understanding, and quantifying the storage and processing of information is a thread that ties these disparate fields together, and in particular the study of cognition in humans and machines. Specifically, the pursuit of an integrative computational theory of human and artificial cognition often leverages information-theoretic principles as bridges between various cognitive functions and neural representations. Insights from information-theoretic formalizations have also led to tangible outcomes that have influenced the operation of artificial intelligence systems. One example is the information bottleneck (IB) approach, which has yielded insights into learning in neural networks (NNs) as well as tools for slow feature analysis and speech recognition. A central application of the IB approach to NNs views the transfer of data between layers as an autoencoder; a variational approximation of the IB then produces an objective that is feasible to minimize and results in efficient training, known as the variational IB (VIB). In the other direction, the variational autoencoder (VAE) framework has also been used to explain cognitive functions. The IB approach has further been applied to emergent communication (EC) in both humans and machines, using a vector-quantized VIB (VQ-VIB) method that extends the aforementioned VIB. Another example is the trade-off between information and value in the context of sequential decision making. This formalism has led to tangible methods for solving sequential decision-making problems and has even been used in experimental studies of mouse navigation, of drivers' eye-gaze patterns, and of drivers' language models.

In aiming to understand machine learning (ML), specifically in the context of NNs, or cognition, we need theoretical principles (hypotheses) that can be tested. To quote Shannon: "I personally believe that many of the concepts of information theory will prove useful in these other fields - and, indeed, some results are already quite promising - but the establishing of such applications is not a trivial matter of translating words to a new domain, but rather the slow tedious process of hypothesis and experimental verification. If, for example, the human being acts in some situations like an ideal decoder, this is an experimental and not a mathematical fact, and as such must be tested under a wide variety of experimental situations." Today, both ML and the study of cognition can draw on huge amounts of data, and establishing quantitative theories and corresponding computational methods can have a massive impact on progress in these fields.

Broadly, this workshop aims to further the understanding of information flow in cognitive processes and in neural network models of cognition. More concretely, this year's workshop goals are twofold. On the one hand, we wish to provide a fruitful platform for discussions of how the storage and processing of information, in either human or artificial cognitive systems, can be formulated via information-theoretic measures, such as the formalisms mentioned above.
Specifically, the workshop is meant to bring information theory researchers into these discussions, allowing first-hand sharing of knowledge and ideas. On the other hand, we hope the workshop can advance, sharpen, and enhance the research on computing information-theoretic quantities, specifically for the needs and benefits of cognition research. The two aims are not independent of one another: any information-theoretic formalism that we wish to verify experimentally has to be, in some sense, computationally feasible. Moreover, we want computation and estimation methods to be developed in ways that are tailored to the open questions in human and artificial cognition. The workshop therefore focuses on bringing together researchers interested in integrating information-theoretic approaches, who come from cognitive science, neuroscience, linguistics, economics, and beyond, with researchers focused on the computation and estimation of information-theoretic quantities, with the aim of tightening the collaboration between the two communities. Efforts to compute and estimate information-theoretic quantities are pursued for many reasons, and this line of research has gained increasing attention due to advances in ML; in recent years these researchers have created new methods to measure information-related quantities.
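As a brief sketch of the IB formalism mentioned above (standard notation, stated here for orientation rather than taken from any particular workshop contribution): for an input X, a relevance variable Y, and a compressed representation Z satisfying the Markov chain Y - X - Z, the IB principle seeks an encoder p(z|x) that trades compression against relevance with a trade-off parameter beta,

\[
\min_{p(z \mid x)} \; \mathcal{L}_{\mathrm{IB}} \;=\; \beta\, I(X;Z) \;-\; I(Z;Y),
\]

and the variational IB (VIB) makes this trainable by bounding the mutual-information terms with a variational decoder q(y|z) and a prior r(z),

\[
\mathcal{L}_{\mathrm{VIB}} \;=\; \mathbb{E}_{p(x,y)}\,\mathbb{E}_{p(z \mid x)}\!\left[-\log q(y \mid z)\right] \;+\; \beta\, \mathbb{E}_{p(x)}\!\left[\mathrm{KL}\!\left(p(z \mid x)\,\|\,r(z)\right)\right],
\]

which upper-bounds the IB objective up to the additive constant H(Y) and can be minimized with stochastic gradients.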
Schedule
Fri 6:15 a.m. - 6:30 a.m. | Poster Organization (hanging posters: please hang your poster)
Fri 6:30 a.m. - 6:40 a.m. | Opening Remarks | Noga Zaslavsky
Fri 6:40 a.m. - 7:10 a.m. | The Physics of Science (Invited talk) | Karl Friston
Fri 7:10 a.m. - 7:20 a.m. | States as goal-directed concepts: an epistemic approach to state-representation learning (Oral) | Nadav Amir · Yael Niv · Angela Langdon
Fri 7:20 a.m. - 7:50 a.m. | Human Information Processing in Complex Networks (Invited talk) | Danielle S Bassett
Fri 7:50 a.m. - 8:00 a.m. | Discrete, compositional, and symbolic representations through attractor dynamics (Oral) | Andrew Nam · Eric Elmoznino · Nikolay Malkin · Chen Sun · Yoshua Bengio · Guillaume Lajoie
Fri 8:00 a.m. - 8:30 a.m. | Coffee Break + posters
Fri 8:30 a.m. - 9:00 a.m. | Resource-rational prediction in real and artificial neural networks (Invited talk) | Sarah Marzen
Fri 9:00 a.m. - 9:10 a.m. | Lossy Compression and the Granularity of Causal Representation (Oral) | David Kinney · Tania Lombrozo
Fri 9:10 a.m. - 9:40 a.m. | Information Theory for Representation Learning (Invited talk) | Alemi
Fri 9:40 a.m. - 9:43 a.m. | What can AI Learn from Human Exploration? Intrinsically-Motivated Humans and Agents in Open-World Exploration (Spotlight) | Alison Gopnik · Pieter Abbeel · Maria Rufova · Alyssa L Dayan · Eliza Kosoy · Yuqing Du
Fri 9:43 a.m. - 9:46 a.m. | Active Vision with Predictive Coding and Uncertainty Minimization (Spotlight) | Abdelrahman Sharafeldin · Nabil Imam · Hannah Choi
Fri 9:46 a.m. - 9:49 a.m. | Natural Language Systematicity from a Constraint on Excess Entropy (Spotlight) | Richard Futrell
Fri 9:49 a.m. - 9:52 a.m. | The Perception-Uncertainty Tradeoff in Generative Restoration Models (Spotlight) | Regev Cohen · Ehud Rivlin · Daniel Freedman
Fri 9:52 a.m. - 9:55 a.m. | An Information-Theoretic Understanding of Maximum Manifold Capacity Representations (Spotlight) | Rylan Schaeffer · Berivan Isik · Victor Lecomte · Mikail Khona · Yann LeCun · Andrey Gromov · Ravid Shwartz-Ziv · Sanmi Koyejo
Fri 9:55 a.m. - 9:58 a.m. | Cognitive Information Filters: Algorithmic Choice Architecture for Boundedly Rational Choosers (Spotlight) | Stefan Bucher · Peter Dayan
Fri 10:00 a.m. - 11:30 a.m. | Lunch break
Fri 11:30 a.m. - 12:00 p.m. | An information perspective on language, cumulative culture, and human uniqueness (Invited talk) | Noah Goodman
Fri 12:00 p.m. - 12:10 p.m. | Information theoretic study of the neural geometry induced by category learning (Oral) | Laurent Bonnasse-Gahot · Jean-Pierre Nadal
Fri 12:10 p.m. - 12:40 p.m. | Clustering and phase transitions in self-attention dynamics (Invited talk) | Yury Polyanskiy
Fri 12:40 p.m. - 1:30 p.m. | CDR: An Information-Theoretic Framework for Cognitive Dimension Reduction (Poster) | Maya Leshkowitz
Fri 12:40 p.m. - 1:30 p.m. | Balancing utility and cognitive cost in social representation (Poster) | Max Taylor-Davies · Christopher G Lucas
Fri 12:40 p.m. - 1:30 p.m. | A Work in Progress: Tighter Bounds on the Information Bottleneck for Deep Learning (Poster) | Nir Weingarten · Moshe Butman · Ran Gilad-Bachrach
Fri 12:40 p.m. - 1:30 p.m. | Finding Relevant Information in Saliency Related Neural Networks (Poster) | Ron M. Hecht · Gershon Celniker · Ronit Bustin · Dan Levi · Ariel Telpaz · Omer Tsimhoni · Ke Liu
Fri 12:40 p.m. - 1:30 p.m. | One if by land, two if by sea, three if by four seas, and more to come: values of perception, prediction, communication, and common sense in decision making (Poster) | Aolin Xu
Fri 12:40 p.m. - 1:30 p.m. | Information Flows Reveal Computational Mechanisms of RNNs in Contextual Decision-making (Poster) | Miles Mahon · Praveen Venkatesh
Fri 12:40 p.m. - 1:30 p.m. | The Distortion-Perception Tradeoff in Finite Channels with Arbitrary Distortion Measures (Poster) | Dror Freirich · Nir Weinberger · Ron Meir
Fri 12:40 p.m. - 1:30 p.m. | Decision confidence reflects maximum entropy reinforcement learning (Poster) | Amelia Johnson · Michael Buice · Koosha Khalvati
Fri 12:40 p.m. - 1:30 p.m. | Optimum Self-Random Generation Rate and Its Application to the Rate-Distortion-Perception Problem (Poster) | Ryo Nomura
Fri 12:40 p.m. - 1:30 p.m. | Aberrant High-Order Dependencies in Schizophrenia Resting-State Functional MRI Networks (Poster) | Qiang Li · Vince Calhoun · Adithya Ram Ballem · Shujian Yu · Jesús Malo · Armin Iraji
Fri 12:40 p.m. - 1:30 p.m. | Influence of the geometry of the feature space on curiosity based exploration (Poster) | Grégoire Sergeant-Perthuis · Nils Ruet · David Rudrauf · Dimitri Ognibene · Yvain Tisserand
Fri 12:40 p.m. - 1:30 p.m. | Large Language Models Behave (Almost) As Rational Speech Actors: Insights From Metaphor Understanding (Poster) | Gaia Carenini · Louis Bodot · Luca Bischetti · Walter Schaeken · Valentina Bambini
Fri 12:40 p.m. - 1:30 p.m. | Empowerment, Free Energy Principle and Maximum Occupancy Principle Compared (Poster) | Ruben Moreno Bote · Jorge Ramirez Ruiz
Fri 12:40 p.m. - 1:30 p.m. | Practical estimation of ensemble accuracy (Poster) | Simi Haber · Yonatan Wexler
Fri 12:40 p.m. - 1:30 p.m. | Attention Schema in Neural Agents (Poster) | Dianbo Liu · Samuele Bolotta · Mike He Zhu · Zahra Sheikhbahaee · Yoshua Bengio · Guillaume Dumas
Fri 12:40 p.m. - 1:30 p.m. | Noisy Population Dynamics Lead to Efficiently Compressed Semantic Systems (Poster) | Nathaniel Imel · Noga Zaslavsky · Michael Franke · Richard Futrell
Fri 12:40 p.m. - 1:30 p.m. | On Complex Network Dynamics of an In-Vitro Neuronal System during Rest and Gameplay (Poster) | Moein Khajehnejad · Forough Habibollahi · Alon Loeffler · Brett J. Kagan · Adeel Razi
Fri 12:40 p.m. - 1:30 p.m. | Introducing an Improved Information-Theoretic Measure of Predictive Uncertainty (Poster) | Kajetan Schweighofer · Lukas Aichberger · Mykyta Ielanskyi · Sepp Hochreiter
Fri 12:40 p.m. - 1:30 p.m. | What can AI Learn from Human Exploration? Intrinsically-Motivated Humans and Agents in Open-World Exploration (Poster) | Yuqing Du · Eliza Kosoy · Alyssa L Dayan · Maria Rufova · Pieter Abbeel · Alison Gopnik
Fri 12:40 p.m. - 1:30 p.m. | Variable Selection in GPDMs Using the Information Bottleneck Method (Poster) | Jesse St. Amand · Martin Giese
Fri 12:40 p.m. - 1:30 p.m. | Information-Theoretic Generalization Error Bound of Deep Neural Networks (Poster) | Haiyun He · Christina Yu · Ziv Goldfeld
Fri 12:40 p.m. - 1:30 p.m. | Information theoretic study of the neural geometry induced by category learning (Poster) | Laurent Bonnasse-Gahot · Jean-Pierre Nadal
Fri 12:40 p.m. - 1:30 p.m. | Lossy Compression and the Granularity of Causal Representation (Poster) | David Kinney · Tania Lombrozo
Fri 12:40 p.m. - 1:30 p.m. | The Perception-Uncertainty Tradeoff in Generative Restoration Models (Poster) | Regev Cohen · Ehud Rivlin · Daniel Freedman
Fri 12:40 p.m. - 1:30 p.m. | Cognitive Information Filters: Algorithmic Choice Architecture for Boundedly Rational Choosers (Poster) | Stefan Bucher · Peter Dayan
Fri 12:40 p.m. - 1:30 p.m. | Discrete, compositional, and symbolic representations through attractor dynamics (Poster) | Andrew Nam · Eric Elmoznino · Nikolay Malkin · Chen Sun · Yoshua Bengio · Guillaume Lajoie
Fri 12:40 p.m. - 1:30 p.m. | Active Vision with Predictive Coding and Uncertainty Minimization (Poster) | Abdelrahman Sharafeldin · Nabil Imam · Hannah Choi
Fri 12:40 p.m. - 1:30 p.m. | An Information-Theoretic Understanding of Maximum Manifold Capacity Representations (Poster) | Rylan Schaeffer · Berivan Isik · Victor Lecomte · Mikail Khona · Yann LeCun · Andrey Gromov · Ravid Shwartz-Ziv · Sanmi Koyejo
Fri 12:40 p.m. - 1:30 p.m. | States as goal-directed concepts: an epistemic approach to state-representation learning (Poster) | Nadav Amir · Yael Niv · Angela Langdon
Fri 12:40 p.m. - 1:30 p.m. | Natural Language Systematicity from a Constraint on Excess Entropy (Poster) | Richard Futrell
Fri 12:40 p.m. - 1:30 p.m. | Learning Causally Emergent Representations (Poster) | Christos Kaplanis · Pedro A.M. Mediano · Fernando Rosas
Fri 12:40 p.m. - 1:30 p.m. | InfoCog Poster Session
Fri 1:50 p.m. - 2:00 p.m. | Information-Theoretic Generalization Error Bound of Deep Neural Networks (Oral) | Haiyun He · Christina Yu · Ziv Goldfeld
Fri 2:00 p.m. - 2:55 p.m. | Information theory, cognition, and deep learning: Challenges and opportunities (Panel discussion) | Sarah Marzen · Stephan Mandt · Noah Goodman · Danielle S Bassett · Noga Zaslavsky · Rava Azeredo da Silveira · Ron M. Hecht · Ronit Bustin
Fri 3:00 p.m. - 3:30 p.m. | Poster Organization (removing posters: please remove your poster)