SmolHub Playground! 🚀

Welcome to our experimental playground! Here you'll find small-scale models, proof-of-concepts, and fun AI experiments. Perfect for learning, testing ideas, and rapid prototyping.

Smol Mixtral 🎮

A PyTorch implementation of a Mixtral-inspired transformer model with Mixture of Experts (MoE), designed for text generation and understanding tasks. This model is built...
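If MoE is new to you, here's a minimal sketch of the core routing idea (not this repo's actual code): a learned router picks the top-k expert MLPs per token and blends their outputs. The `MoELayer` name, layer sizes, and expert count are illustrative.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    """Minimal top-k Mixture-of-Experts feed-forward layer (illustrative)."""
    def __init__(self, dim: int, num_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(dim, num_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.SiLU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, dim) -> flatten to one row per token for routing
        tokens = x.reshape(-1, x.size(-1))
        logits = self.router(tokens)                    # (tokens, num_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)  # top-k experts per token
        weights = F.softmax(weights, dim=-1)            # renormalize over chosen experts
        out = torch.zeros_like(tokens)
        for e, expert in enumerate(self.experts):
            token_ids, slot = (idx == e).nonzero(as_tuple=True)
            if token_ids.numel() == 0:
                continue  # no token routed to this expert in this batch
            out[token_ids] += weights[token_ids, slot].unsqueeze(-1) * expert(tokens[token_ids])
        return out.view_as(x)
```

The point of the design: each token only activates `top_k` of the experts, so parameter count grows with `num_experts` while per-token compute stays roughly constant.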
Smol Transformer 🎮

A compact implementation of an Encoder-Decoder Transformer for sequence-to-sequence translation tasks. This project implements a translation model from English to Hindi using the Samanantar dataset...
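As a rough illustration of what such a seq2seq setup looks like (the actual project code may differ), here's a minimal encoder-decoder built on `torch.nn.Transformer`. The `SmolSeq2Seq` name, vocab sizes, and layer counts are assumptions, and positional encodings are omitted for brevity.

```python
import torch
import torch.nn as nn

class SmolSeq2Seq(nn.Module):
    """Minimal encoder-decoder Transformer for translation (sketch only)."""
    def __init__(self, src_vocab: int, tgt_vocab: int, dim: int = 256):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, dim)
        self.tgt_emb = nn.Embedding(tgt_vocab, dim)
        self.transformer = nn.Transformer(
            d_model=dim, nhead=4,
            num_encoder_layers=3, num_decoder_layers=3,
            batch_first=True,
        )
        self.lm_head = nn.Linear(dim, tgt_vocab)

    def forward(self, src: torch.Tensor, tgt: torch.Tensor) -> torch.Tensor:
        # Causal mask so the decoder cannot peek at future target tokens.
        tgt_mask = nn.Transformer.generate_square_subsequent_mask(tgt.size(1))
        h = self.transformer(self.src_emb(src), self.tgt_emb(tgt), tgt_mask=tgt_mask)
        return self.lm_head(h)  # (batch, tgt_len, tgt_vocab) logits

# Toy usage: English token ids in, Hindi token ids out (shapes are arbitrary).
model = SmolSeq2Seq(src_vocab=32000, tgt_vocab=32000)
logits = model(torch.randint(0, 32000, (2, 10)), torch.randint(0, 32000, (2, 12)))
```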
Story Kimi 🎮

A PyTorch implementation of a DeepSeek-V3-inspired transformer model with Mixture of Experts (MoE), Latent Attention, and other advanced features. 🎮
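Assuming "Latent Attention" here means DeepSeek-style Multi-head Latent Attention (MLA), the core trick is caching a small low-rank latent instead of full keys and values. A rough sketch of that bottleneck, with illustrative dimensions and RoPE details omitted (not the repo's actual implementation):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LatentAttention(nn.Module):
    """Sketch of MLA's core idea: a low-rank KV bottleneck."""
    def __init__(self, dim: int = 512, n_heads: int = 8, kv_latent: int = 64):
        super().__init__()
        self.n_heads, self.head_dim = n_heads, dim // n_heads
        self.q_proj = nn.Linear(dim, dim, bias=False)
        self.kv_down = nn.Linear(dim, kv_latent, bias=False)  # compress: cache only this
        self.k_up = nn.Linear(kv_latent, dim, bias=False)     # expand latent back to keys
        self.v_up = nn.Linear(kv_latent, dim, bias=False)     # ...and to values
        self.o_proj = nn.Linear(dim, dim, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, T, D = x.shape
        latent = self.kv_down(x)  # (B, T, kv_latent): this is all the KV cache holds
        q = self.q_proj(x).view(B, T, self.n_heads, self.head_dim).transpose(1, 2)
        k = self.k_up(latent).view(B, T, self.n_heads, self.head_dim).transpose(1, 2)
        v = self.v_up(latent).view(B, T, self.n_heads, self.head_dim).transpose(1, 2)
        out = F.scaled_dot_product_attention(q, k, v, is_causal=True)
        return self.o_proj(out.transpose(1, 2).reshape(B, T, D))
```

Because only `latent` needs caching at inference time, the KV cache shrinks by roughly `dim * 2 / kv_latent` versus standard multi-head attention.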
Story Llama 🎮

So, I trained a Llama, an 88M-parameter architecture I coded from the ground up, to build a small instruct model, going through the below-mentioned stages from...
Story Mixtral 🎮

A PyTorch implementation of a Mixtral-inspired transformer model with Mixture of Experts (MoE), Flash Attention, and other advanced features. 🎮
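The MoE side is sketched under Smol Mixtral above. For Flash Attention, recent PyTorch can dispatch `scaled_dot_product_attention` to a FlashAttention kernel on supported GPUs; a minimal sketch assuming PyTorch 2.3+ and a CUDA device (the repo may use a different API or its own kernel):

```python
import torch
import torch.nn.functional as F
from torch.nn.attention import sdpa_kernel, SDPBackend

# (batch, heads, seq, head_dim) in fp16; FlashAttention needs half precision on GPU.
q = torch.randn(1, 8, 128, 64, device="cuda", dtype=torch.float16)
k, v = torch.randn_like(q), torch.randn_like(q)

# Force the FlashAttention backend for this region (errors if unsupported).
with sdpa_kernel(SDPBackend.FLASH_ATTENTION):
    out = F.scaled_dot_product_attention(q, k, v, is_causal=True)
```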
Smol Llama 🎮

So, I trained a Llama, a 130M-parameter architecture I coded from the ground up, to build a small instruct model, going through the below-mentioned stages from...