Comparative performance of mutual information and transfer entropy for analyzing the balance of information flow and energy consumption at synapses

Mireille Conrad, Renaud B Jolivet

Research output: Working paper / Preprint

Abstract

Information theory has become an essential tool of modern neuroscience. It can however be difficult to apply in experimental contexts when acquisition of very large datasets is prohibitive. Here, we compare the relative performance of two information theoretic measures, mutual information and transfer entropy, for the analysis of information flow and energetic consumption at synapses. We show that transfer entropy outperforms mutual information in terms of reliability of estimates for small datasets. However, we also show that a detailed understanding of the underlying neuronal biophysics is essential for properly interpreting the results obtained with transfer entropy. We conclude that when time and experimental conditions permit, mutual information might provide an easier-to-interpret alternative. Finally, we apply both measures to the study of energetic optimality of information flow at thalamic relay synapses in the visual pathway. We show that both measures recapitulate the experimental finding that these synapses are tuned to optimally balance information flowing through them with the energetic consumption associated with that synaptic and neuronal activity. Our results highlight the importance of conducting systematic computational studies prior to applying information theoretic tools to experimental data.
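To illustrate the two quantities the abstract compares, here is a minimal plug-in (maximum-likelihood) estimator for mutual information and transfer entropy on binarized spike trains. This is an illustrative sketch, not the authors' implementation; the one-step relay model used in the demo is an assumption chosen to make the directed dependence obvious.

```python
# Plug-in estimators for mutual information and transfer entropy
# between discrete sequences, e.g. binarized spike trains.
from collections import Counter
from math import log2
import random

def mutual_information(x, y):
    """I(X;Y) in bits for paired discrete sequences x, y."""
    n = len(x)
    px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
    return sum((c / n) * log2((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())

def transfer_entropy(x, y, k=1):
    """TE(X -> Y) in bits with history length k: the reduction in
    uncertainty about y[t] from knowing x[t-k:t] beyond what the
    target's own past y[t-k:t] already provides."""
    n = len(y)
    triples, yy, past_xy, past_y = Counter(), Counter(), Counter(), Counter()
    for t in range(k, n):
        yp, xp = tuple(y[t - k:t]), tuple(x[t - k:t])
        triples[(y[t], yp, xp)] += 1
        yy[(y[t], yp)] += 1
        past_xy[(yp, xp)] += 1
        past_y[yp] += 1
    m = n - k
    te = 0.0
    for (yt, yp, xp), c in triples.items():
        p_full = c / past_xy[(yp, xp)]      # p(y_t | y_past, x_past)
        p_self = yy[(yt, yp)] / past_y[yp]  # p(y_t | y_past)
        te += (c / m) * log2(p_full / p_self)
    return te

# Synthetic example: y relays x with a one-step delay, so all of y's
# information comes from x's past rather than its own.
random.seed(0)
x = [random.randint(0, 1) for _ in range(2000)]
y = [0] + x[:-1]
print(round(transfer_entropy(x, y, k=1), 3))  # close to 1 bit
```

Because transfer entropy conditions on the target's own past, it separates directed influence from shared history, which is one reason it can behave differently from mutual information on the same data; both plug-in estimators are biased upward on small samples, consistent with the dataset-size concerns the abstract raises.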
Original language: English
Publication status: Published - 1 Jun 2020
Externally published: Yes

Keywords

  • neuroscience
