Codage prédictif dans les réseaux de neurones biologiques

by Mirjana Maras

Doctoral thesis project in Neuroscience

Supervised by Sophie Denève.

Thesis in preparation at Paris Sciences et Lettres, within the doctoral school Cerveau, cognition, comportement (Paris), in partnership with the Laboratoire de Neurosciences Cognitives (laboratory) and the École normale supérieure (Paris) (host institution), since 01-10-2015.


  • Abstract

    One of the most exciting questions in theoretical neuroscience is how much the coding of sensory information is grounded in the biophysical properties of individual neurons, and how much in the dynamics of the network in which those neurons are embedded. Driven by this fundamental question, the goal of my doctoral thesis is to develop a simplified but biophysically plausible computational model of a biological neural network. I use a top-down approach that nevertheless predicts many aspects of neural responses, such as the balance between excitation and inhibition, asynchronous irregular spike trains, and spike-timing-dependent plasticity rules. This model was originally introduced in the predictive coding framework by my PhD supervisor, Dr. Denève. The predictive coding interpretation of neural activity supports the non-traditional view that most neural variability is not noise, but rather a consequence of a coding-efficiency principle: the local neuronal circuit minimizes the prediction error of its input by using an optimal number of spikes and maintaining a tight balance between excitation and inhibition. However, some aspects of this model remain highly unrealistic, such as its all-to-all connectivity and instantaneous synapses. To test the model's limits, both analytically and numerically, we have begun to progressively incorporate more biological constraints. For example, I have simulated a network of leaky integrate-and-fire neurons with sparse recurrent connections, and trained the connection weights with a local, balance-restoring learning rule. The resulting network is more efficient than a network of independent Poisson neurons, but less efficient than the all-to-all optimal network. We have also begun to investigate realistic synaptic delays, and expect that these could improve the efficiency of the sparsely connected network.
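
    The spike rule at the heart of this framework can be sketched in a few lines. The toy simulation below is a one-dimensional illustration in the spirit of Denève's predictive coding networks, not the thesis code: the parameter values, the 2 Hz test signal, and the greedy one-spike-per-time-step rule are all assumptions made for this sketch. Each unit's voltage tracks its projection of the prediction error, and a unit fires only when its spike would reduce that error.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters, chosen for illustration only.
N = 20            # number of neurons
dt = 1e-3         # time step (s)
steps = 1000      # 1 s of simulated time
lam = 10.0        # leak rate of the readout (1/s)

# Each neuron i adds its decoding weight w[i] to the readout when it spikes.
w = rng.normal(0.0, 1.0, size=N) / np.sqrt(N)
thresh = 0.5 * w ** 2          # threshold = half the "cost" of the neuron's spike

t_axis = np.arange(steps) * dt
x = np.sin(2 * np.pi * 2.0 * t_axis)   # 2 Hz signal to be encoded

x_hat = 0.0                    # network estimate of x
est = np.empty(steps)          # recorded estimate over time
spikes = np.zeros((steps, N), dtype=bool)

for t in range(steps):
    x_hat -= dt * lam * x_hat          # leaky decay of the readout
    V = w * (x[t] - x_hat)             # voltage = projected prediction error
    i = int(np.argmax(V - thresh))     # greedy rule: at most one spike per step
    if V[i] > thresh[i]:
        spikes[t, i] = True
        x_hat += w[i]                  # instantaneous synapse: jump the readout
    est[t] = x_hat

mse = float(np.mean((x - est) ** 2))
print(f"{spikes.sum()} spikes, tracking MSE = {mse:.4f}")
```

    Because a neuron fires only when the projected error exceeds its threshold (half the cost of its own spike), the network spends spikes only where they reduce the readout error, which is the coding-efficiency principle described in the abstract; the sparsely connected, delayed variants studied in the thesis relax the idealizations (instantaneous synapses, all-to-all coupling) that this sketch retains.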

  • Translated title

    Predictive coding in biological neural networks

