SmolHub Playground!

Welcome to our experimental playground! Here you'll find small-scale models, proof-of-concepts, and fun AI experiments. Perfect for learning, testing ideas, and rapid prototyping.

Smol Mixtral

A PyTorch implementation of a Mixtral-inspired transformer model with Mixture of Experts (MoE), designed for text generation and understanding tasks. This model is built...
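The core idea behind a Mixtral-style MoE layer is that each token is routed to a small subset of expert feed-forward networks. A minimal sketch of top-k routing is below; all dimensions, names, and the expert count are illustrative assumptions, not the project's actual code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    """Minimal top-k Mixture-of-Experts layer (hypothetical dimensions)."""
    def __init__(self, d_model=32, d_ff=64, num_experts=4, k=2):
        super().__init__()
        self.k = k
        # router: scores each token against each expert
        self.gate = nn.Linear(d_model, num_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.SiLU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x):                         # x: (tokens, d_model)
        logits = self.gate(x)                     # (tokens, num_experts)
        weights, idx = logits.topk(self.k, dim=-1)
        weights = F.softmax(weights, dim=-1)      # renormalize over the chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e          # tokens whose slot-th choice is expert e
                if mask.any():
                    out[mask] += weights[mask, slot, None] * expert(x[mask])
        return out

moe = TopKMoE()
y = moe(torch.randn(5, 32))                       # y has the same shape as the input
```

Only k of the num_experts networks run per token, which is what lets MoE models grow parameter count without a proportional increase in compute per token.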
Smol Transformer

A compact implementation of an Encoder-Decoder Transformer for sequence-to-sequence translation tasks. This project implements a translation model from English to Hindi using the Samanantar dataset....
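The encoder-decoder setup for translation can be sketched with PyTorch's built-in `nn.Transformer`: the encoder reads the English source, and the decoder generates Hindi tokens under a causal mask. Vocabulary sizes and dimensions below are illustrative assumptions, not the project's real configuration.

```python
import torch
import torch.nn as nn

src_vocab, tgt_vocab, d_model = 100, 120, 32   # hypothetical sizes

class TinySeq2Seq(nn.Module):
    """Toy encoder-decoder translation model (illustrative only)."""
    def __init__(self):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, d_model)
        self.tgt_emb = nn.Embedding(tgt_vocab, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=4,
            num_encoder_layers=2, num_decoder_layers=2,
            dim_feedforward=64, batch_first=True)
        self.out = nn.Linear(d_model, tgt_vocab)

    def forward(self, src, tgt):
        # causal mask so each target position only attends to earlier positions
        causal = nn.Transformer.generate_square_subsequent_mask(tgt.size(1))
        h = self.transformer(self.src_emb(src), self.tgt_emb(tgt), tgt_mask=causal)
        return self.out(h)                       # (batch, tgt_len, tgt_vocab)

model = TinySeq2Seq()
logits = model(torch.randint(0, src_vocab, (2, 7)),
               torch.randint(0, tgt_vocab, (2, 5)))
```

Training would minimize cross-entropy between `logits` and the target tokens shifted by one position.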
Story Kimi

A PyTorch implementation of a DeepSeek V3-inspired transformer model with Mixture of Experts (MoE), Latent Attention, and other advanced features.
Story Llama

So, I trained a Llama, an 88M-parameter architecture I coded from the ground up, to build a small instruct model, going through the stages mentioned below, from...
Story Mixtral

A PyTorch implementation of a Mixtral-inspired transformer model with Mixture of Experts (MoE), Flash Attention, and other advanced features.
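In recent PyTorch versions, Flash Attention is typically reached through `torch.nn.functional.scaled_dot_product_attention`, which dispatches to a fused FlashAttention kernel when the hardware and dtypes allow, and falls back to the math implementation otherwise. A minimal usage sketch with assumed shapes:

```python
import torch
import torch.nn.functional as F

# hypothetical shapes: (batch, heads, seq_len, head_dim)
q = torch.randn(2, 4, 16, 8)
k = torch.randn(2, 4, 16, 8)
v = torch.randn(2, 4, 16, 8)

# fused attention with a causal mask; never materializes the full
# (seq_len x seq_len) attention matrix when a fused kernel is used
out = F.scaled_dot_product_attention(q, k, v, is_causal=True)
```

The output keeps the query's shape, so it drops into a standard multi-head attention block in place of an explicit softmax(QK^T)V computation.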
Smol Llama

So, I trained a Llama, a 130M-parameter architecture I coded from the ground up, to build a small instruct model, going through the stages mentioned below, from...