Explore projects
PGI15-Teaching / BICE24 RevealJS (MIT License)
Title: An introduction to training algorithms for neuromorphic computing and on-line learning
Abstract: Training neural networks implemented in neuromorphic hardware is challenging due to the dynamic, sparse, and local nature of the computations. This tutorial will describe established gradient-based solutions to these challenges in the context of real-valued recurrent neural networks and spiking neural networks. Insights into gradient-based training algorithms and the associated automatic-differentiation methods lead to online synaptic plasticity rules and to the assumptions needed to implement them on in-memory computing devices. The tutorial will conclude with methods for improving and optimizing learning algorithms using meta-learning and other meta-optimization approaches.
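The abstract's step from gradient-based training to online plasticity rules can be illustrated with a small sketch. The NumPy snippet below is not taken from the tutorial material; it is a minimal, assumed example in the spirit of eligibility-trace rules such as e-prop or RFLO, in which a leaky recurrent network is updated online from local traces and a readout error broadcast back through the output weights. Network size, leak factor, and learning rate are placeholder values.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_rec, n_out, T = 3, 20, 2, 50
alpha = 0.9   # leak factor of the hidden state and of the traces (placeholder)
lr = 1e-3     # learning rate (placeholder)

W_in  = rng.normal(0.0, 0.3, (n_rec, n_in))
W_rec = rng.normal(0.0, 0.3, (n_rec, n_rec))
W_out = rng.normal(0.0, 0.3, (n_out, n_rec))

x = rng.normal(size=(T, n_in))     # placeholder input sequence
y = rng.normal(size=(T, n_out))    # placeholder target sequence

h = np.zeros(n_rec)                # leaky hidden state
z = np.zeros(n_rec)                # unit outputs
eps_in  = np.zeros((n_rec, n_in))  # filtered presynaptic traces, input synapses
eps_rec = np.zeros((n_rec, n_rec)) # filtered presynaptic traces, recurrent synapses

for t in range(T):
    z_prev = z
    # Filtered presynaptic activity carries the influence of past inputs forward in time.
    eps_in  = alpha * eps_in  + x[t][None, :]
    eps_rec = alpha * eps_rec + z_prev[None, :]

    h = alpha * h + W_in @ x[t] + W_rec @ z_prev
    z = np.tanh(h)
    psi = 1.0 - z ** 2             # local derivative dz/dh

    # Eligibility traces: local derivative times filtered presynaptic activity.
    e_in  = psi[:, None] * eps_in
    e_rec = psi[:, None] * eps_rec

    # Learning signal: readout error projected back through the output weights
    # (no backpropagation through time, so the update can run online).
    err = W_out @ z - y[t]
    L = W_out.T @ err

    # Online three-factor weight updates.
    W_out -= lr * np.outer(err, z)
    W_in  -= lr * L[:, None] * e_in
    W_rec -= lr * L[:, None] * e_rec
```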
Ingo Meyer / config_update (MIT License)
Tool for syncing config files between several hosts. It manages differences between systems with Git branches.
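The blurb suggests a branch-per-host layout. Purely as an assumed illustration of that general pattern, and not of config_update's actual interface or file layout, the sketch below uses GitPython in a throwaway repository: shared settings live on a common branch, each host gets its own branch, and hosts pick up shared changes by merging. All file, branch, and user names are invented for the example.

```python
import tempfile
from pathlib import Path

from git import Repo  # GitPython

# Throwaway repository standing in for a shared configuration repo.
workdir = Path(tempfile.mkdtemp())
repo = Repo.init(workdir)
with repo.config_writer() as cfg:  # commit identity just for this sandbox repo
    cfg.set_value("user", "name", "Example User")
    cfg.set_value("user", "email", "user@example.org")

# Host-independent configuration lives on the default branch.
(workdir / "app.conf").write_text("timeout = 30\n")
repo.git.add("app.conf")
repo.git.commit("-m", "Add shared config")
common = repo.active_branch.name   # e.g. "master" or "main"

# One branch per host, starting from the shared state and carrying local differences.
repo.git.checkout("-b", "laptop")
(workdir / "laptop.conf").write_text("proxy = http://proxy.example:3128\n")
repo.git.add("laptop.conf")
repo.git.commit("-m", "Laptop-only proxy settings")

# Shared changes land on the common branch ...
repo.git.checkout(common)
(workdir / "app.conf").write_text("timeout = 60\n")
repo.git.add("app.conf")
repo.git.commit("-m", "Raise shared timeout")

# ... and each host branch picks them up with a merge, keeping its own additions.
repo.git.checkout("laptop")
repo.git.merge(common)
print((workdir / "app.conf").read_text())     # contains the shared update
print((workdir / "laptop.conf").read_text())  # host-specific file survives the merge
```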
Florian Rhiem / doc-utils (MIT License)
Repository containing utilities for creating documentation.
Tobias Ferrari / doc-utils (MIT License)
Repository containing utilities for creating documentation.
doc-utils / doc-utils (MIT License)
Repository containing utilities for creating documentation.
Gideon Müller / f2ch (MIT License)
Convert Fortran modules to C headers. Taken over from https://github.com/sharifmarat/fortran_to_c_headers
Electron transport calculation code based on the real-space finite-difference formalism within the framework of density functional theory.
Literature review on machine learning density functional theory (ML-DFT) and related topics
Ingo Meyer / simple_c_logger (MIT License)