Citations
=========

If our brain dynamics programming ecosystem has contributed to your research and you would like to acknowledge the project in an academic publication, we suggest citing the following papers.

BrainPy
-------

If you are using BrainPy 2.x, please use the following citations:

- Wang C, Zhang T, Chen X, et al. BrainPy, a flexible, integrative, efficient, and extensible framework for general-purpose brain dynamics programming. eLife, 2023, 12: e86365.
- Wang C, Zhang T, He S, et al. A differentiable brain simulator bridging brain simulation and brain-inspired computing. The Twelfth International Conference on Learning Representations, 2024.
.. code-block:: bibtex

   @article{wang2023brainpy,
       article_type = {journal},
       title = {BrainPy, a flexible, integrative, efficient, and extensible framework for general-purpose brain dynamics programming},
       author = {Wang, Chaoming and Zhang, Tianqiu and Chen, Xiaoyu and He, Sichao and Li, Shangyang and Wu, Si},
       editor = {Stimberg, Marcel},
       volume = {12},
       year = {2023},
       month = {dec},
       pub_date = {2023-12-22},
       pages = {e86365},
       citation = {eLife 2023;12:e86365},
       doi = {10.7554/eLife.86365},
       url = {https://doi.org/10.7554/eLife.86365},
       journal = {eLife},
       issn = {2050-084X},
       publisher = {eLife Sciences Publications, Ltd},
   }


.. code-block:: bibtex

   @inproceedings{wang2024brainpy,
       title = {A differentiable brain simulator bridging brain simulation and brain-inspired computing},
       author = {Wang, Chaoming and Zhang, Tianqiu and He, Sichao and Gu, Hongyaoxing and Li, Shangyang and Wu, Si},
       booktitle = {The Twelfth International Conference on Learning Representations},
       year = {2024}
   }

If you are using BrainPy 1.x, please use the following citation:

.. code-block:: bibtex

   @inproceedings{wang2021just,
       title = {A Just-In-Time Compilation Approach for Neural Dynamics Simulation},
       author = {Wang, Chaoming and Jiang, Yingqian and Liu, Xinyu and Lin, Xiaohan and Zou, Xiaolong and Ji, Zilong and Wu, Si},
       booktitle = {International Conference on Neural Information Processing},
       pages = {15--26},
       year = {2021},
       organization = {Springer}
   }

brainunit
---------

If you are using brainunit, please use the following citation:

- Wang C, He S, Luo S, et al. BrainUnit: Integrating Physical Units into High-Performance AI-Driven Scientific Computing. bioRxiv, 2024: 2024.09.20.614111.

.. code-block:: bibtex

   @article{Wang2024brainunit,
       author = {Wang, Chaoming and He, Sichao and Luo, Shouwei and Huan, Yuxiang and Wu, Si},
       title = {BrainUnit: Integrating Physical Units into High-Performance AI-Driven Scientific Computing},
       elocation-id = {2024.09.20.614111},
       year = {2024},
       doi = {10.1101/2024.09.20.614111},
       publisher = {Cold Spring Harbor Laboratory},
       abstract = {Artificial intelligence (AI) is revolutionizing scientific research across various disciplines. The foundation of scientific research lies in rigorous scientific computing based on standardized physical units. However, current mainstream high-performance numerical computing libraries for AI generally lack native support for physical units, significantly impeding the integration of AI methodologies into scientific research. To fill this gap, we introduce BrainUnit, a unit system designed to seamlessly integrate physical units into AI libraries, with a focus on compatibility with JAX. BrainUnit offers a comprehensive library of over 2000 physical units and more than 300 unit-aware mathematical functions. It is fully compatible with JAX transformations, allowing for automatic differentiation, just-in-time compilation, vectorization, and parallelization while maintaining unit consistency. We demonstrate BrainUnit{\textquoteright}s efficacy through several use cases in brain dynamics modeling, including detailed biophysical neuron simulations, multiscale brain network modeling, neuronal activity fitting, and cognitive task training. Our results show that BrainUnit enhances the accuracy, reliability, and interpretability of scientific computations across scales, from ion channels to whole-brain networks, without significantly impacting performance. By bridging the gap between abstract computational frameworks and physical units, BrainUnit represents a crucial step towards more robust and physically grounded AI-driven scientific computing. Competing Interest Statement: The authors have declared no competing interest.},
       url = {https://www.biorxiv.org/content/early/2024/09/22/2024.09.20.614111},
       eprint = {https://www.biorxiv.org/content/early/2024/09/22/2024.09.20.614111.full.pdf},
       journal = {bioRxiv}
   }

brainscale
----------

If you are using brainscale, please use the following citation:

- Wang C, Dong X, Jiang J, et al. BrainScale: Enabling scalable online learning in spiking neural networks. bioRxiv, 2024: 2024.09.24.614728.

.. code-block:: bibtex

   @article{Wang2024brainscale,
       author = {Wang, Chaoming and Dong, Xingsi and Jiang, Jiedong and Ji, Zilong and Liu, Xiao and Wu, Si},
       title = {BrainScale: Enabling Scalable Online Learning in Spiking Neural Networks},
       elocation-id = {2024.09.24.614728},
       year = {2024},
       doi = {10.1101/2024.09.24.614728},
       publisher = {Cold Spring Harbor Laboratory},
       abstract = {Whole-brain simulation stands as one of the most ambitious endeavors of our time, yet it remains constrained by significant technical challenges. A critical obstacle in this pursuit is the absence of a scalable online learning framework capable of supporting the efficient training of complex, diverse, and large-scale spiking neural networks (SNNs). To address this limitation, we introduce BrainScale, a framework specifically designed to enable scalable online learning in SNNs. BrainScale achieves three key advancements for scalability. (1) Model diversity: BrainScale accommodates the complex dynamics of brain function by supporting a wide spectrum of SNNs through a streamlined abstraction of synaptic interactions. (2) Efficient scaling: Leveraging SNN intrinsic characteristics, BrainScale achieves an online learning algorithm with linear memory complexity. (3) User-friendly programming: BrainScale provides a programming environment that automates the derivation and execution of online learning computations for any user-defined models. Our comprehensive evaluations demonstrate BrainScale{\textquoteright}s efficiency and robustness, showing a hundred-fold improvement in memory utilization and several-fold acceleration in training speed while maintaining performance on long-term dependency tasks and neuromorphic datasets. These results suggest that BrainScale represents a crucial step towards brain-scale SNN training and whole-brain simulation. Competing Interest Statement: The authors have declared no competing interest.},
       url = {https://www.biorxiv.org/content/early/2024/09/24/2024.09.24.614728},
       eprint = {https://www.biorxiv.org/content/early/2024/09/24/2024.09.24.614728.full.pdf},
       journal = {bioRxiv}
   }