Why use BrainPy
BrainPy is an integrative framework for computational neuroscience and brain-inspired computation based on Just-In-Time (JIT) compilation (built on top of JAX and Numba). Core functions provided in BrainPy include:
- JIT compilation for class objects.
- Numerical solvers for ODEs, SDEs, DDEs, FDEs, and others.
- Dynamics simulation tools for various brain objects, like neurons, synapses, networks, soma, dendrites, channels, and even more.
- Dynamics analysis tools for differential equations, including phase plane analysis, bifurcation analysis, continuation analysis, and sensitivity analysis.
- Seamless integration with deep learning models, with the speed benefit of JIT compilation.
- And more ......
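To make the numerical solvers above concrete, here is a hand-written explicit Euler step for a leaky integrator. This is a plain-NumPy sketch of the kind of integration loop BrainPy's solvers generate automatically, not BrainPy's actual implementation:

```python
import numpy as np

# Hand-written Euler integration of dV/dt = (-(V - V_rest) + I) / tau.
# BrainPy's ODE solvers produce update steps of this form automatically.
def euler_integrate(v0, v_rest=-65.0, tau=10.0, current=5.0, dt=0.1, steps=1000):
    v = v0
    trace = np.empty(steps)
    for i in range(steps):
        dv = (-(v - v_rest) + current) / tau  # right-hand side of the ODE
        v = v + dt * dv                       # explicit Euler update
        trace[i] = v
    return trace

trace = euler_integrate(-65.0)
# The membrane potential converges toward V_rest + I = -60.0 mV
```

With BrainPy, the same right-hand-side function would be handed to a numerical solver, and the update rule (Euler, Runge-Kutta, Exponential Euler, etc.) is generated for you.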
BrainPy is designed to effectively satisfy your basic requirements:
- Easy to learn and use: BrainPy is based only on the Python language and has few dependency requirements.
- Flexible and transparent: BrainPy gives users full control over the data/logic flow. Users can code any logic they want with BrainPy.
- Extensible: BrainPy allows users to extend new functionality purely through Python coding. For example, we extend the numerical integration with the ability to perform numerical analysis. In this way, the same code in BrainPy can be used not only for simulation but also for dynamics analysis.
- Efficient: All code in BrainPy can be just-in-time compiled (based on JAX and Numba) to run on CPU or GPU devices, guaranteeing its running efficiency.
How to use BrainPy
Step 1: installation
BrainPy is based on Python (>=3.6), and the following packages need to be installed to use it:
- NumPy >= 1.15
- Matplotlib >= 3.4
For installation details, please see the documentation: Quickstart/Installation
Method 1: install BrainPy using pip
To install the stable release of BrainPy (V1.0.3), please use
> pip install -U brain-py
To install the latest pre-release version of BrainPy (V1.1.0), please use
> pip install -U brain-py --pre
If you have installed a previous version of BrainPy, please uninstall the older one first:
> pip uninstall brainpy-simulator
> pip install -U brain-py --pre
Method 2: install BrainPy from source:
> pip install git+https://github.com/PKU-NIP-Lab/BrainPy
> pip install git+https://git.openi.org.cn/OpenI/BrainPy
> pip install -e git://github.com/PKU-NIP-Lab/BrainPy.git@V1.0.0
Other dependencies: if you want to get full support from BrainPy, please install the following packages:
- JAX >= 0.2.10, needed for the "jax" backend and the "dnn" module
- Numba >= 0.52, needed for JIT compilation on the "numpy" backend
- SymPy >= 1.4, needed for the dynamics "analysis" module and the Exponential Euler method
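Since these packages are optional, it can be handy to check which ones are present in your environment. The helper below is a small illustrative sketch (not part of BrainPy) that reports each package's version, or None if it is not installed:

```python
import importlib

# Return the version string of an importable package, "unknown" if it
# defines no __version__, or None if the package is not installed.
def check_dependency(name):
    try:
        module = importlib.import_module(name)
    except ImportError:
        return None
    return getattr(module, "__version__", "unknown")

# Report the status of BrainPy's required and optional dependencies.
for pkg in ("numpy", "matplotlib", "jax", "numba", "sympy"):
    print(f"{pkg}: {check_dependency(pkg)}")
```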
Step 2: useful links
Step 3: inspirational examples
Here we list several examples of BrainPy. For more detailed examples and tutorials, please see BrainModels.
- Leaky integrate-and-fire neuron model, source code
- Exponential integrate-and-fire neuron model, source code
- Quadratic integrate-and-fire neuron model, source code
- Adaptive Quadratic integrate-and-fire model, source code
- Adaptive Exponential integrate-and-fire model, source code
- Generalized integrate-and-fire model, source code
- Hodgkin–Huxley neuron model, source code
- Izhikevich neuron model, source code
- Morris-Lecar neuron model, source code
- Hindmarsh-Rose bursting neuron model, source code
See brainmodels.neurons to find more.
See brainmodels.synapses to find more.
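The leaky integrate-and-fire model at the top of the list above can be sketched in a few lines of plain NumPy. This is a simplified illustration of the model's dynamics under assumed parameter values, not the BrainModels source code:

```python
# Minimal leaky integrate-and-fire neuron: integrate
# dV/dt = (-(V - V_rest) + I) / tau with explicit Euler,
# and emit a spike plus reset whenever V crosses the threshold.
def simulate_lif(current=20.0, v_rest=-65.0, v_reset=-68.0, v_th=-50.0,
                 tau=10.0, dt=0.1, duration=100.0):
    steps = int(duration / dt)
    v = v_rest
    spike_times = []
    for i in range(steps):
        v += dt * (-(v - v_rest) + current) / tau
        if v >= v_th:                 # threshold crossing -> spike and reset
            spike_times.append(i * dt)
            v = v_reset
    return spike_times

spikes = simulate_lif()  # regular spiking under constant input current
```

In BrainModels, the same dynamics are written as a neuron class whose ODE is handed to BrainPy's numerical solvers, so the update loop above is generated and JIT-compiled for you.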
- [CANN] (Si Wu, 2008) Continuous-attractor Neural Network
- (Vreeswijk & Sompolinsky, 1996) E/I balanced network
- (Sherman & Rinzel, 1992) Gap junction leads to anti-synchronization
- (Wang & Buzsáki, 1996) Gamma Oscillation
- (Brunel & Hakim, 1999) Fast Global Oscillation
- (Diesmann, et al., 1999) Synfire Chains
- [Working Memory] (Mi, et al., 2017) STP for Working Memory Capacity
- [Working Memory] (Bouchacourt & Buschman, 2019) Flexible Working Memory Model
- [Decision Making] (Wang, 2002) Decision making spiking model
- [Recurrent Network] (Laje & Buonomano, 2013) Robust Timing in RNN
- [Recurrent Network] (Sussillo & Abbott, 2009) FORCE Learning
Low-dimensional dynamics analysis
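A core task in low-dimensional dynamics analysis is locating the fixed points of a system. The sketch below finds the fixed points of a one-dimensional system dx/dt = f(x) by scanning for sign changes and refining each root by bisection; BrainPy's analysis tools perform this kind of computation (on real neuron models, in higher dimensions) automatically, so this is an illustration of the idea rather than BrainPy's implementation:

```python
import numpy as np

# Locate fixed points of dx/dt = f(x): scan a grid for sign changes of f,
# then refine each bracketed root with bisection.
def find_fixed_points(f, x_min, x_max, n=1000, tol=1e-8):
    xs = np.linspace(x_min, x_max, n)
    roots = []
    for a, b in zip(xs[:-1], xs[1:]):
        fa, fb = f(a), f(b)
        if fa == 0.0:
            roots.append(a)
        elif fa * fb < 0:        # sign change -> a root lies in (a, b)
            while b - a > tol:   # bisection refinement
                m = 0.5 * (a + b)
                if f(a) * f(m) <= 0:
                    b = m
                else:
                    a = m
            roots.append(0.5 * (a + b))
    return roots

# Example: dx/dt = x - x**3 has fixed points at -1, 0, and 1.
fps = find_fixed_points(lambda x: x - x**3, -2.0, 2.0)
```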
Learning through back-propagation