hmunachi
EE + Maths + Computing + Cognition
Funding Links: https://github.com/sponsors/HMUNACHI
- Name: Henry Ndubuaku
- Location: London
- Company: Cactus Compute
- Kind: user
- Followers: 95
- Following: 32
- Repositories count: 1
- Created at: 2023-03-22T12:22:48.131Z
- Updated at: 2025-05-08T18:44:17.361Z
- Last synced at: 2025-05-08T18:44:17.361Z
GitHub Sponsors Profile
AI Research Engineer who builds and shares open-source models, particularly through the NanoDL project.
Developing and training transformer-based models is typically resource-intensive and time-consuming, and AI/ML experts frequently need to build smaller-scale versions of these models for specific problems. Jax, a low-resource yet powerful framework, accelerates the development of neural networks, but existing resources for transformer development in Jax are limited. NanoDL addresses this challenge with the following features:
- A wide array of blocks and layers, facilitating the creation of customised transformer models from scratch.
- An extensive selection of models such as LlaMa2, Mistral, Mixtral, GPT3, GPT4 (inferred), T5, Whisper, ViT, Mixers, GAT, CLIP, and more, catering to a variety of tasks and applications.
- Data-parallel distributed trainers, so developers can efficiently train large-scale models on multiple GPUs or TPUs without writing manual training loops (a minimal sketch of the pattern follows this list).
- Dataloaders, making data handling for Jax/Flax more straightforward and effective.
- Custom layers not found in Flax/Jax, such as RoPE, GQA, MQA, and Swin attention, allowing for more flexible model development (a RoPE sketch follows this list).
- GPU/TPU-accelerated classical ML models such as PCA, K-Means, regression, and Gaussian processes, akin to Scikit-Learn on GPU (a PCA sketch follows this list).
- Modular design, so users can blend elements from various models, such as GPT, Mixtral, and LlaMa2, to craft unique hybrid transformer models.
- A range of advanced algorithms for NLP and computer-vision tasks, such as Gaussian blur and BLEU.
- Each model is contained in a single file with no external dependencies, so the source code itself can be reused easily.
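To give a rough idea of what a data-parallel trainer automates, here is a minimal sketch in plain Jax using `jax.pmap`: gradients are averaged across devices with `lax.pmean` so every replica takes the same update step. The toy linear model, loss, and learning rate are placeholders for illustration, not NanoDL's API.

```python
import jax
import jax.numpy as jnp
from functools import partial

LR = 1e-2  # fixed learning rate, chosen arbitrarily for this sketch

def loss_fn(params, batch):
    # Toy linear-regression loss; stands in for a real model's loss.
    x, y = batch
    pred = x @ params["w"] + params["b"]
    return jnp.mean((pred - y) ** 2)

@partial(jax.pmap, axis_name="devices")
def train_step(params, batch):
    loss, grads = jax.value_and_grad(loss_fn)(params, batch)
    # Average gradients across devices so all replicas stay in sync.
    grads = jax.lax.pmean(grads, axis_name="devices")
    params = jax.tree_util.tree_map(lambda p, g: p - LR * g, params, grads)
    return params, loss

n_dev = jax.local_device_count()
params = jax.device_put_replicated(
    {"w": jnp.zeros((4, 1)), "b": jnp.zeros((1,))}, jax.local_devices()
)
# Shard the batch: the leading axis indexes devices.
x = jnp.ones((n_dev, 32, 4))
y = jnp.ones((n_dev, 32, 1))
params, loss = train_step(params, (x, y))
```

A library trainer wraps this replicate/shard/pmean bookkeeping behind one call, which is the convenience the feature list refers to.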
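Since RoPE is called out among the custom layers, here is a minimal sketch of rotary position embeddings in plain Jax (the "rotate-half" formulation). It illustrates the underlying technique rather than reproducing NanoDL's layer; all shapes and names are assumptions for the example.

```python
import jax
import jax.numpy as jnp

def rotary_embedding(x, base=10000):
    # Apply RoPE to a tensor of shape (seq_len, num_heads, head_dim).
    # Channel pairs are rotated by position-dependent angles, which
    # encodes relative position directly into queries and keys.
    seq_len, num_heads, head_dim = x.shape
    half = head_dim // 2
    # Per-channel inverse frequencies: base^(-2i / head_dim).
    inv_freq = 1.0 / (base ** (jnp.arange(half) * 2.0 / head_dim))
    positions = jnp.arange(seq_len, dtype=jnp.float32)
    angles = jnp.einsum("s,d->sd", positions, inv_freq)  # (seq_len, half)
    cos = jnp.cos(angles)[:, None, :]  # broadcast over heads
    sin = jnp.sin(angles)[:, None, :]
    x1, x2 = x[..., :half], x[..., half:]
    # Rotate each (x1, x2) channel pair by its angle.
    return jnp.concatenate([x1 * cos - x2 * sin, x1 * sin + x2 * cos], axis=-1)

# Usage: rotate query/key projections before computing attention.
q = jax.random.normal(jax.random.PRNGKey(0), (16, 8, 64))  # (seq, heads, dim)
q_rot = rotary_embedding(q)
print(q_rot.shape)  # (16, 8, 64)
```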
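Finally, the classical-ML point amounts to running algorithms such as PCA on accelerator-backed arrays. A minimal SVD-based sketch, again in plain Jax rather than NanoDL's interface:

```python
import jax
import jax.numpy as jnp

def pca(x, n_components):
    # PCA via economy SVD on mean-centred data; jnp ops run on GPU/TPU.
    x = x - x.mean(axis=0)
    u, s, vt = jnp.linalg.svd(x, full_matrices=False)
    components = vt[:n_components]       # principal directions
    return x @ components.T, components  # projected data, directions

data = jax.random.normal(jax.random.PRNGKey(0), (1000, 16))
projected, directions = pca(data, n_components=2)
print(projected.shape)  # (1000, 2)
```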
- Current Sponsors: 0
- Past Sponsors: 0
- Total Sponsors: 0
- Minimum Sponsorship: $1.00