Mathematical and Computational Neuroscience Lab

Research Interests

My main research interests involve the study of neural information processing, neural coding, and information representation in biological systems. In particular, I am interested in understanding the information-processing functions of neural ensemble activity and the biological mechanisms through which these functions are implemented. The quantitative tools I apply to achieve my research goals come mostly from branches of applied probability (information theory, signal processing theory, multivariate statistics, stochastic differential equations), dynamical systems theory, and group theory. Many other branches of mathematics and statistics, with optimization, operations research, and differential geometry being the ones I have used most recently, find application at particular stages of my research. Furthermore, I hold the opinion that a biological system is understood sufficiently well when it can be modeled through an engineered artifact. Thus, my research has natural connections to applied engineering, through which the information-processing structures and functions of biological sensory systems can be translated into algorithms and signal-processing devices.

My current research concentrates on three related directions: developing analytical tools and quantitative approaches for characterizing the neural representation of sensory stimuli; studying structure/function relations in biophysical models of neural systems; and implementing neural models in neuromorphic hardware (Loihi and BrainScaleS). These research directions are flexible and adapt easily to new collaborations and research environments.
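As a generic illustration of the first direction, quantifying how well neural responses represent sensory stimuli, the mutual information between a stimulus variable and a response variable can be computed from their joint distribution. The sketch below is not code from the lab; it is a minimal, self-contained example of the standard discrete mutual-information formula, with a hypothetical toy stimulus/response table.

```python
import numpy as np

def mutual_information(p_xy):
    """Mutual information I(X;Y) in bits for a discrete joint distribution.

    p_xy: 2-D array of joint probabilities (rows: stimuli, cols: responses)
          that sums to 1.
    """
    p_xy = np.asarray(p_xy, dtype=float)
    px = p_xy.sum(axis=1, keepdims=True)   # stimulus marginal P(x)
    py = p_xy.sum(axis=0, keepdims=True)   # response marginal P(y)
    nz = p_xy > 0                          # skip zero cells: 0*log(0) := 0
    return float((p_xy[nz] * np.log2(p_xy[nz] / (px @ py)[nz])).sum())

# Hypothetical noiseless binary stimulus/response channel: carries 1 bit.
joint = np.array([[0.5, 0.0],
                  [0.0, 0.5]])
print(mutual_information(joint))  # 1.0
```

For an independent joint distribution (e.g. all cells equal to 0.25) the same function returns 0 bits, which is the sanity check that responses carrying no stimulus information yield zero mutual information.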

Select Publications

  1. Parker, Dimitrov. Symmetry-Breaking Bifurcations of the Information Bottleneck and Related Problems. Entropy, 24(9), 2022.
  2. Dey, Dimitrov. Mapping and Validating a Point Neuron Model on Intel's Neuromorphic Hardware Loihi. Frontiers in Neuroinformatics, 16, 2022.
  3. Dimitrov, Fekri, Lazar, Moser, Thomas. Guest Editorial: Biological Applications of Information Theory, in Honor of Claude Shannon's Centennial, Parts I, II. IEEE Transactions on Molecular, Biological, and Multi-scale Communications, 2, 2017.
  4. Cummins, Dimitrov, Mayko, Portfors. Inhibition does not affect the timing code for vocalizations in the mouse auditory midbrain. Frontiers in Physiology, 5, doi: 10.3389/fphys.2014.00140, 2014.
  5. Dimitrov, Aldworth. Timing information in insect mechanosensory systems. In Victor, DiLorenzo (eds), Temporal Processing in Nervous Systems, Taylor & Francis, 2013.
  6. Gedeon, Parker, Dimitrov. The mathematical structure of Information Bottleneck methods. Entropy, 14(3), 456-479, 2012.
  7. Dimitrov, Cummins, Baker, Aldworth. Characterizing the fine structure of a neural sensory code through Information Distortion. Journal of Computational Neuroscience, doi: 10.1007/s10827-010-0261-4, 2010.
  8. Parker, Dimitrov, Gedeon. Symmetry breaking in soft clustering decoding of neural codes. IEEE Transactions on Information Theory (special issue on Molecular Biology and Neuroscience), 83(2), 2010.
  9. Dimitrov, Gedeon. Effects of stimulus transformations on characteristics of sensory neuron function. Journal of Computational Neuroscience, 20:265-283, 2006.
  10. Aldworth, Miller, Gedeon, Cummins, Dimitrov. Dejittered Spike-Conditioned Stimulus Waveforms Yield Improved Estimates of Neuronal Feature Selectivity and Spike-Timing Precision of Sensory Interneurons. The Journal of Neuroscience, 25(22):5323-5332, 2005.

Education

  • PhD, Applied Mathematics, The University of Chicago (1998)
  • MSc, Physics, The University of Chicago (1993)
  • BSc, Physics, University of Sofia, Bulgaria (1991)