Master Theses / Projects at PGI-15

Master Thesis: Power Law Scaling of Foundational Models using Local Learning

The goal of this Master’s thesis is to explore the power law scaling (Cherti et al. 2023) of neuromorphic algorithms and hardware by replacing conventional global backpropagation with biologically inspired local learning rules. On a physical substrate, any computation is characterized by the set of variables available to the physical processing elements. If only a subset of these variables is available to a computational element, the computation is characterized as “local”. Locality reduces the communication needed for training (either literally, or by reducing parameter synchronization in model-parallel approaches) and makes the learning rules compatible with a physical or neuromorphic implementation. Achieving high accuracy with local computations is a key challenge in neuromorphic hardware and other compute-in-memory architectures.
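
To illustrate what “local” means here, the sketch below shows one possible local learning rule; it is not part of the project description, and the Oja-style Hebbian update, layer sizes, and variable names are chosen purely for illustration. Each layer updates its weights using only its own input and output activity, with no error signal propagated back from other layers.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_hebbian_step(W, x, lr=1e-3):
    """Update W using only variables local to this layer: its input x
    and its own output y = tanh(W @ x), plus a decay term for stability."""
    y = np.tanh(W @ x)                               # post-synaptic activity
    dW = lr * (np.outer(y, x) - y[:, None] ** 2 * W)  # Oja-style local rule
    return W + dW, y

# Two stacked layers; each is trained with purely local information.
W1 = rng.normal(scale=0.1, size=(64, 128))
W2 = rng.normal(scale=0.1, size=(10, 64))

for _ in range(100):
    x = rng.normal(size=128)           # stand-in for an input sample
    W1, h = local_hebbian_step(W1, x)  # layer 1 sees only (x, h)
    W2, _ = local_hebbian_step(W2, h)  # layer 2 sees only (h, its output)
```

In contrast, backpropagation would require each layer to receive gradients computed from the outputs of all downstream layers, which is exactly the global communication that local rules avoid.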

Last Modified: 25.04.2024