Computational neuroscience

Computational neuroscience (also known as theoretical neuroscience) studies brain function in terms of the information processing properties of the structures that make up the nervous system. [1] [2] It is an interdisciplinary computational science that links the diverse fields of neuroscience, cognitive science, and psychology with electrical engineering, computer science, mathematics, and physics.

Computational neuroscience emphasizes descriptions of functionally and biologically realistic neurons (and neural systems) and their physiology and dynamics, which distinguishes it from psychological connectionism and from disciplines such as machine learning, neural networks, and computational learning theory.

These models capture the essential features of the biological system at multiple spatial-temporal scales, from membrane currents, proteins, and chemical coupling to network oscillations, columnar and topographic architecture, and learning and memory. Furthermore, these computational models frame hypotheses that can be directly tested by biological or psychological experiments.

History

The term "computational neuroscience" was introduced by Eric L. Schwartz, who organized a conference, held in 1985 in Carmel, California, at the request of the Systems Development Foundation to provide a summary of the current status of a field which until that point was referred to by a variety of names, such as neural modeling, brain theory, and neural networks. The proceedings of this definitional meeting were published in 1990 as Computational Neuroscience. [3] The first open international meeting focused on computational neuroscience was organized by James M. Bower and John Miller in San Francisco, California in 1989 and has continued as the annual CNS meeting. [4] The first graduate program in computational neuroscience was organized as the Computational and Neural Systems Ph.D. program at the California Institute of Technology in 1985.

The early historical roots of the field can be traced to the work of people such as Louis Lapicque, Hodgkin & Huxley, Hubel & Wiesel, and David Marr. Lapicque introduced the integrate and fire model of the neuron in a seminal paper published in 1907; [5] this model is still popular for mathematical, biological, and artificial neural network studies because of its simplicity (see a recent review [6] ) and has, in fact, seen recent experimental and biophysical support. [7] [8]

About 40 years later, Hodgkin & Huxley developed the voltage clamp and created the first biophysical model of the action potential. Hubel & Wiesel discovered that neurons in the primary visual cortex, the first cortical area to process information coming from the retina, have oriented receptive fields and are organized in columns. [9] David Marr's work focused on the interactions between neurons, suggesting computational approaches to the study of how functional groups of neurons within the hippocampus and neocortex interact, store, process, and transmit information. Computational modeling of biophysically realistic neurons and dendrites began with the work of Wilfrid Rall, with the first multicompartmental model using cable theory.

Major topics

Research in computational neuroscience can be roughly categorized into several lines of inquiry. Most computational neuroscientists collaborate closely with experimentalists in analyzing novel data and synthesizing new models of biological phenomena.

Single-neuron modeling

Main article: Biological neuron models

Even single neurons have complex biophysical characteristics and can perform computations (e.g. [10] ). Hodgkin and Huxley's original model employed only two voltage-sensitive currents, the fast-acting sodium and the inward-rectifying potassium. Though successful in predicting the timing and qualitative features of the action potential, it nevertheless failed to predict a number of important features such as adaptation and shunting. Scientists now believe that there is a wide variety of voltage-sensitive currents, and the implications of the differing dynamics, modulations, and sensitivities of these currents are an important topic of computational neuroscience. [11]
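In the Hodgkin–Huxley formalism, the membrane potential obeys a current-balance equation; a standard textbook statement of the model (the notation below is the conventional one, not taken verbatim from the cited works) is

$$C_m \frac{dV}{dt} = -\bar{g}_{\mathrm{Na}}\, m^3 h\,(V - E_{\mathrm{Na}}) - \bar{g}_{\mathrm{K}}\, n^4 (V - E_{\mathrm{K}}) - g_L (V - E_L) + I_{\mathrm{ext}},$$

where each gating variable $x \in \{m, h, n\}$ relaxes according to $\dot{x} = \alpha_x(V)(1 - x) - \beta_x(V)\,x$. Additional voltage-sensitive currents are incorporated by adding further conductance terms of the same form.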

The computational functions of complex dendrites are also under intense investigation. There is a large body of literature regarding how different currents interact with geometric properties of neurons. [12]

Some models also track biochemical pathways at very small scales such as dendritic spines or synaptic clefts.

There are many software packages, such as GENESIS and NEURON, that allow rapid and systematic in silico modeling of realistic neurons. Blue Brain, a project founded by Henry Markram of the École Polytechnique Fédérale de Lausanne, aims to build a biophysically detailed simulation of a cortical column on the Blue Gene supercomputer.

Modeling the richness of biophysical properties on the single-neuron scale can supply mechanisms that serve as the building blocks for network dynamics. [13] However, detailed neuron descriptions are computationally expensive, and this cost can handicap the pursuit of realistic network investigations, where many neurons need to be simulated. As a result, researchers who study large neural circuits typically represent each neuron and synapse with an artificially simple model, ignoring much of the biological detail. Hence there is a drive to reduce the complexity of this problem: algorithms have been developed to produce faithful, faster-running, simplified neuron models from computationally expensive, detailed neuron models. [14]
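As a minimal sketch of the kind of simplified point-neuron model used in large network studies, the following code integrates a leaky integrate-and-fire neuron driven by a constant current; all parameter values and names are illustrative assumptions rather than any particular published reduction.

```python
import numpy as np

def simulate_lif(I_ext=2.0e-9, t_max=0.5, dt=1e-4,
                 tau_m=20e-3, R_m=1e7, V_rest=-70e-3,
                 V_th=-54e-3, V_reset=-80e-3):
    """Euler integration of a leaky integrate-and-fire neuron.

    All parameters (input current, membrane time constant, resistance,
    resting/threshold/reset potentials) are illustrative, not fitted values.
    """
    steps = int(t_max / dt)
    V = np.full(steps, V_rest)
    spike_times = []
    for i in range(1, steps):
        # Membrane equation: tau_m * dV/dt = -(V - V_rest) + R_m * I_ext
        V[i] = V[i - 1] + dt / tau_m * (-(V[i - 1] - V_rest) + R_m * I_ext)
        if V[i] >= V_th:              # threshold crossing: emit a spike
            spike_times.append(i * dt)
            V[i] = V_reset            # and reset the membrane potential
    return V, spike_times

V_trace, spikes = simulate_lif()
print(f"{len(spikes)} spikes in 0.5 s of simulated time")
```

Models of this kind trade biophysical detail for speed: the entire state of the neuron is a single variable, so networks of many thousands of such units can be simulated cheaply.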

Development, axonal patterning, and guidance

Computational neuroscience aims to address a wide array of questions, including: How do axons and dendrites form during development? How do growing axons know where to target and how to reach those targets? How do neurons migrate to the proper position in the central and peripheral systems? How do synapses form? We know from molecular biology that distinct parts of the nervous system release distinct chemical cues, from growth factors to hormones, that modulate and influence the growth and development of functional connections between neurons.

Theoretical investigations into the formation and patterning of synaptic connections and morphology are still nascent. One hypothesis that has garnered some attention is the minimal wiring hypothesis, which postulates that the formation of axons and dendrites effectively minimizes resource allocation while maintaining maximal information storage. [15]

Sensory processing

Early models of sensory processing understood within a theoretical framework are credited to Horace Barlow. Somewhat similar to the minimal wiring hypothesis described in the preceding section, Barlow understood the processing of the early sensory systems to be a form of efficient coding, where the neurons encode information in a way that minimizes the number of spikes. Experimental and computational work have since supported this hypothesis in one form or another.
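One common way the efficient-coding idea has been formalized in the sparse-coding literature is as the minimization of a cost that trades reconstruction accuracy against the activity used to represent a stimulus (the notation here is assumed for illustration):

$$E = \Big\| \mathbf{x} - \sum_i a_i \,\boldsymbol{\phi}_i \Big\|^2 + \lambda \sum_i |a_i|,$$

where $\mathbf{x}$ is the sensory input, $\boldsymbol{\phi}_i$ are the features encoded by individual neurons, $a_i$ are their activities, and $\lambda$ controls how strongly sparse (low-activity) codes are favored.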

Current research in sensory processing is divided between biophysical modeling of different subsystems and more theoretical modeling of perception. Current models of perception have suggested that the brain performs some form of Bayesian inference and integration of different sensory information in generating our perception of the physical world. [16] [17]
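For the simple case of two independent cues corrupted by Gaussian noise, this kind of Bayesian integration has a closed form that recurs throughout the cue-combination literature (stated here in standard textbook notation rather than as reproduced from the cited studies):

$$\hat{s} = \frac{\sigma_2^{2}}{\sigma_1^{2} + \sigma_2^{2}}\, s_1 + \frac{\sigma_1^{2}}{\sigma_1^{2} + \sigma_2^{2}}\, s_2, \qquad \sigma_{\hat{s}}^{2} = \frac{\sigma_1^{2}\,\sigma_2^{2}}{\sigma_1^{2} + \sigma_2^{2}},$$

so each cue is weighted by its reliability (inverse variance), and the combined estimate is more reliable than either cue alone.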

Memory and synaptic plasticity

Main article: Synaptic plasticity

Earlier models of memory are primarily based on the postulates of Hebbian learning. Biologically relevant models such as the Hopfield net have been developed to address the properties of the associative (also known as content-addressable) style of memory that occurs in biological systems. These attempts focus primarily on the formation of medium- and long-term memory, localizing in the hippocampus. Models of working memory, relying on theories of network oscillations and persistent activity, have been built to capture some features of the prefrontal cortex in context-related memory. [18]
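A minimal sketch of a Hopfield-style associative memory, assuming binary (±1) units and the standard Hebbian storage rule (illustrative code, not a model taken from the cited literature):

```python
import numpy as np

def store_patterns(patterns):
    """Hebbian storage: W[i, j] = (1/N) * sum_mu xi_i^mu * xi_j^mu, no self-connections."""
    n_units = patterns.shape[1]
    W = patterns.T @ patterns / n_units
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, probe, n_sweeps=20, seed=0):
    """Asynchronous updates: each unit takes the sign of its summed input."""
    s = probe.copy()
    rng = np.random.default_rng(seed)
    for _ in range(n_sweeps):
        for i in rng.permutation(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Store two random patterns, then recover one from a corrupted probe.
rng = np.random.default_rng(1)
patterns = rng.choice([-1, 1], size=(2, 100))
W = store_patterns(patterns)
probe = patterns[0].copy()
probe[:15] *= -1                      # flip 15 of the 100 bits
recovered = recall(W, probe)
print("fraction of bits recovered:", np.mean(recovered == patterns[0]))
```

The stored patterns act as attractors of the network dynamics, which is what makes the memory content-addressable: a partial or noisy cue is completed to the nearest stored pattern.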

One of the major problems in neurophysiological memory is how it is maintained and changed through multiple time scales. Unstable synapses are easy to train but also prone to stochastic disruption. Stable synapses forget less easily, but they are also harder to consolidate. One recent computational hypothesis involves cascades of plasticity that allow synapses to function at multiple time scales. [19] Stereochemically detailed models of the acetylcholine receptor-based synapse with the Monte Carlo method, working at the time scale of microseconds, have been built. [20] It is likely that computational tools will contribute greatly to our understanding of how synapses function and change in relation to external stimuli in the coming decades.

Behaviors of networks

Biological neurons are connected to each other in a complex, recurrent fashion. These connections are, unlike most artificial neural networks, sparse and usually specific. It is not known how information is transmitted through such sparsely connected networks, although specific areas of the brain, such as the visual cortex, are understood in some detail. [21] It is also unknown what the computational functions of these specific connectivity patterns are, if any.

The interactions of neurons in a small network can often be reduced to simple models such as the Ising model. The statistical mechanics of such simple systems are well characterized theoretically. Some recent evidence suggests that the dynamics of arbitrary neuronal networks can be reduced to pairwise interactions. [22] It is not known, however, whether such descriptive dynamics impart any important computational function. With the emergence of two-photon microscopy and calcium imaging, we now have powerful experimental methods with which to test new theories regarding neuronal networks.
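In such a pairwise (Ising-like) description, the probability of a binary spiking pattern $\sigma = (\sigma_1, \ldots, \sigma_N)$ with $\sigma_i \in \{-1, +1\}$ is written in maximum-entropy form (standard notation, not reproduced from [22]):

$$P(\sigma) = \frac{1}{Z} \exp\!\Big( \sum_i h_i \sigma_i + \sum_{i<j} J_{ij}\, \sigma_i \sigma_j \Big),$$

where the fields $h_i$ and couplings $J_{ij}$ are fitted so that the model reproduces the measured firing rates and pairwise correlations, and $Z$ is a normalizing constant.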

In some cases the complex interactions between inhibitory and excitatory neurons can be simplified using mean-field theory, which gives rise to the population model of neural networks. While many neurotheorists prefer such models with reduced complexity, others argue that uncovering structural-functional relationships depends on including as much neuronal and network structure as possible. Models of this type are typically built in large simulation platforms like GENESIS or NEURON. There have been some attempts to provide unified methods that bridge these levels of complexity. [23]
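A widely used population-level (mean-field) description is a simplified Wilson–Cowan form, in which the mean activities $E$ and $I$ of excitatory and inhibitory populations evolve as (notation assumed here; the refractory terms of the full model are omitted):

$$\tau_E \frac{dE}{dt} = -E + f\!\big(w_{EE} E - w_{EI} I + h_E\big), \qquad \tau_I \frac{dI}{dt} = -I + f\!\big(w_{IE} E - w_{II} I + h_I\big),$$

where $f$ is a sigmoidal gain function, the $w$ terms are coupling strengths between the populations, and $h_E$, $h_I$ are external inputs.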

Cognition, discrimination, and learning

Computational modeling of higher cognitive functions has only recently begun. Experimental data come primarily from single-unit recording in primates. The frontal lobe and parietal lobe function as integrators of information from multiple sensory modalities. There are some tentative ideas regarding how simple mutually inhibitory functional circuits in these areas may carry out biologically relevant computation. [24]

The brain seems to be able to discriminate and adapt particularly well in certain contexts. For instance, human beings seem to have an enormous capacity for memorizing and recognizing faces. One of the key goals of computational neuroscience is to dissect how biological systems carry out these complex computations efficiently and potentially to replicate these processes in building intelligent machines.

The brain's large-scale organizational principles are illuminated by many fields, including biology, psychology, and clinical practice; such observations form the basis for some quantitative modeling of large-scale brain activity. [25]

The Computational Representational Understanding of Mind (CRUM) is another attempt at modeling human cognition through simulated processes in decision making and the manipulation of visual representations.

Consciousness

One of the ultimate goals of psychology/neuroscience is to be able to explain the everyday experience of conscious life. Francis Crick and Christof Koch made some attempts to formulate a framework for future work in neural correlates of consciousness (NCC), though much of the work in this field remains speculative. [26]

Computational clinical neuroscience

Computational clinical neuroscience is a field that brings together experts in neuroscience, neurology, psychiatry, decision making, and computational modeling to quantitatively define and investigate problems in neurological and psychiatric diseases, and to train scientists and clinicians. [27] [28]

Notable persons

  • Phil Husbands , Professor of Computer Science and Artificial Intelligence at the University of Sussex, England
  • Read Montague , American neuroscientist and popular science author
  • Tomaso Poggio , Eugene McDermott Professor in the Department of Brain and Cognitive Sciences, investigator at the McGovern Institute for Brain Research , a member of the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) and director of both the Center for Biological and Computational Learning at MIT and the Center for Brains, Minds, and Machines.
  • Terry Sejnowski , Investigator at the Howard Hughes Medical Institute and Francis Crick Professor at The Salk Institute for Biological Studies where he directs the Computational Neurobiology Laboratory
  • Haim Sompolinsky , William N. Skirball Professor of Neuroscience at the Edmond and Lily Safra Center for Brain Sciences (formerly the Interdisciplinary Center for Neural Computation), and a Professor of Physics at the Racah Institute of Physics at The Hebrew University of Jerusalem , Israel.

See also

  • Biological neuron models
  • Bayesian Brain
  • Brain-computer interface
  • Brain simulation
  • Computational anatomy
  • Connectionism
  • Medical image computing
  • Mind uploading
  • Neural coding
  • Neural engineering
  • Neural network
  • Neurocomputational speech processing
  • Neuroinformatics
  • Simulated reality
  • Artificial consciousness
  • Cognitive architecture
  • Technological singularity , a hypothesized point at which artificial intelligence would exceed the capabilities of the human brain

Notes and references

  1. Churchland, Patricia S.; Koch, Christof; Sejnowski, Terrence J. (1993). "What is computational neuroscience?". In Schwartz, Eric L. (ed.), Computational Neuroscience. MIT Press. pp. 46–55.
  2. Gerstner, W.; Kistler, W.; Naud, R.; Paninski, L. (2014). Neuronal Dynamics. Cambridge, UK: Cambridge University Press. ISBN 9781107447615.
  3. Schwartz, Eric (1990). Computational Neuroscience. Cambridge, Mass.: MIT Press. ISBN 0-262-19291-8.
  4. Bower, James M. (2013). 20 Years of Computational Neuroscience. Berlin: Springer. ISBN 978-1461414230.
  5. Lapicque, L. (1907). "Quantitative research on electrical excitation of nerves treated as a polarization". J. Physiol. Pathol. Gen. 9: 620–635.
  6. Brunel, N.; Van Rossum, M. C. (2007). "Lapicque's 1907 paper: from frogs to integrate-and-fire". Biol. Cybern. 97 (5–6): 337–339. doi:10.1007/s00422-007-0190-0. PMID 17968583.
  7. Singh, Chandan; Levy, William B. (2017). "A consensus layer V pyramidal neuron can sustain interpulse-interval coding". PLOS ONE. 12 (7): e0180839. doi:10.1371/journal.pone.0180839. ISSN 1932-6203.
  8. Cash, Sydney; Yuste, Rafael (1999). "Linear Summation of Excitatory Inputs by CA1 Pyramidal Neurons". Neuron. 22 (2). doi:10.1016/S0896-6273(00)81098-3. ISSN 0896-6273.
  9. Hubel, D. H.; Wiesel, T. N. (1962). "Receptive fields, binocular interaction and functional architecture in the cat's visual cortex". J. Physiol. 160 (1): 106–154. doi:10.1113/jphysiol.1962.sp006837. PMC 1359523. PMID 14449617.
  10. Forrest, M. D. (2014). "Intracellular Calcium Dynamics Permits a Purkinje Neuron Model to Perform Toggle and Gain Computations Upon Its Inputs". Frontiers in Computational Neuroscience. 8: 86. doi:10.3389/fncom.2014.00086. PMC 4138505. PMID 25191262.
  11. Wu, Samuel Miao-sin; Johnston, Daniel (1995). Foundations of Cellular Neurophysiology. Cambridge, Mass.: MIT Press. ISBN 0-262-10053-3.
  12. Koch, Christof (1999). Biophysics of Computation: Information Processing in Single Neurons. Oxford: Oxford University Press. ISBN 0-19-510491-9.
  13. Forrest, M. D. (2014). "Intracellular Calcium Dynamics Permits a Purkinje Neuron Model to Perform Toggle and Gain Computations Upon Its Inputs". Frontiers in Computational Neuroscience. 8: 86. doi:10.3389/fncom.2014.00086. PMC 4138505. PMID 25191262.
  14. Forrest, M. D. (2015). "Simulation of alcohol action upon a detailed Purkinje neuron model and a simpler surrogate model that runs >400 times faster". BMC Neuroscience. 16 (27). doi:10.1186/s12868-015-0162-6.
  15. Chklovskii, D. B.; Mel, B. W.; Svoboda, K. (2004). "Cortical rewiring and information storage". Nature. 431 (7010): 782–788. Bibcode:2004Natur.431..782C. doi:10.1038/nature03012. PMID 15483599. Review article.
  16. Weiss, Yair; Simoncelli, Eero P.; Adelson, Edward H. (2002). "Motion illusions as optimal percepts". Nature Neuroscience. 5 (6): 598–604. doi:10.1038/nn0602-858.
  17. Ernst, Marc O.; Bülthoff, Heinrich H. (2004). "Merging the senses into a robust percept". Trends in Cognitive Sciences. 8 (4): 162–169. doi:10.1016/j.tics.2004.02.002.
  18. Durstewitz, D.; Seamans, J. K.; Sejnowski, T. J. (2000). "Neurocomputational models of working memory". Nat. Neurosci. 3 (Suppl): 1184–1191. doi:10.1038/81460. PMID 11127836.
  19. Fusi, S.; Drew, P. J.; Abbott, L. F. (2005). "Cascade models of synaptically stored memories". Neuron. 45 (4): 599–611. doi:10.1016/j.neuron.2005.02.001. PMID 15721245.
  20. Coggan, J. S.; Bartol, T. M.; Esquenazi, E.; et al. (2005). "Evidence for ectopic neurotransmission at a neuronal synapse". Science. 309 (5733): 446–451. Bibcode:2005Sci...309..446C. doi:10.1126/science.1108239. PMC 2915764. PMID 16020730.
  21. Olshausen, Bruno A.; Field, David J. (1997). "Sparse coding with an overcomplete basis set: A strategy employed by V1?". Vision Research. 37 (23): 3311–3325. doi:10.1016/S0042-6989(97)00169-7.
  22. Schneidman, E.; Berry, M. J.; Segev, R.; Bialek, W. (2006). "Weak pairwise correlations imply strongly correlated network states in a neural population". Nature. 440 (7087): 1007–1012. arXiv:q-bio/0512013. Bibcode:2006Natur.440.1007S. doi:10.1038/nature04701. PMC 1785327. PMID 16625187.
  23. Eliasmith, Chris; Anderson, Charles H. (2004). Neural Engineering: Computation, Representation, and Dynamics in Neurobiological Systems. Cambridge, Mass.: MIT Press. ISBN 0-262-55060-1.
  24. Machens, C. K.; Romo, R.; Brody, C. D. (2005). "Flexible control of mutual inhibition: a neural model of two-interval discrimination". Science. 307 (5712): 1121–1124. Bibcode:2005Sci...307.1121M. doi:10.1126/science.1104171. PMID 15718474.
  25. Robinson, P. A.; Rennie, C. J.; Rowe, D. L.; O'Connor, S. C.; Gordon, E. (2005). "Multiscale brain modelling". Philosophical Transactions of the Royal Society B. 360 (1457): 1043–1050. doi:10.1098/rstb.2005.1638. PMC 1854922. PMID 16087447.
  26. Crick, F.; Koch, C. (2003). "A framework for consciousness". Nat. Neurosci. 6 (2): 119–126. doi:10.1038/nn0203-119. PMID 12555104.
  27. Adaszewski, S.; Dukart, J.; Kherif, F.; Frackowiak, R.; Draganski, B.; Alzheimer's Disease Neuroimaging Initiative (2013). "How early can we predict Alzheimer's disease using computational anatomy?". Neurobiol Aging. 34 (12): 2815–2826. doi:10.1016/j.neurobiolaging.2013.06.015.
  28. Friston, K. J.; Stephan, K. E.; Montague, R.; Dolan, R. J. (2014). "Computational psychiatry: the brain as a phantastic organ". Lancet Psychiatry. 1 (2): 148–158. doi:10.1016/S2215-0366(14)70275-5.