2. Character-Level GPT: Lightweight GPT-Style Transformer

Generative Modeling • Transformers • Tokenization 🔗 GitHub Repo

Built a character-level GPT model inspired by Karpathy’s nanoGPT to study tokenization, attention scaling, and autoregressive decoding on small datasets.
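A minimal sketch of the character-level pipeline described above, assuming a nanoGPT-style interface where the model maps token indices to next-token logits. The `TinyGPT` class, the sample text, and all dimensions here are illustrative stand-ins, not the repo's actual code; the point is the char-to-id tokenization and the autoregressive sampling loop.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# --- Character-level tokenization: each unique character gets an integer id ---
text = "hello world"                           # placeholder corpus
chars = sorted(set(text))
stoi = {ch: i for i, ch in enumerate(chars)}   # char -> id
itos = {i: ch for ch, i in stoi.items()}       # id -> char

def encode(s: str) -> list[int]:
    return [stoi[c] for c in s]

def decode(ids: list[int]) -> str:
    return "".join(itos[i] for i in ids)

# --- Toy stand-in for a GPT-style model (embedding + linear head only) ---
class TinyGPT(nn.Module):
    def __init__(self, vocab_size: int, n_embd: int = 32, block_size: int = 16):
        super().__init__()
        self.block_size = block_size
        self.tok_emb = nn.Embedding(vocab_size, n_embd)
        self.head = nn.Linear(n_embd, vocab_size)

    def forward(self, idx: torch.Tensor) -> torch.Tensor:
        return self.head(self.tok_emb(idx))    # (B, T, vocab_size) logits

# --- Autoregressive decoding: sample one character at a time ---
@torch.no_grad()
def generate(model: TinyGPT, idx: torch.Tensor, max_new_tokens: int,
             temperature: float = 1.0) -> torch.Tensor:
    for _ in range(max_new_tokens):
        idx_cond = idx[:, -model.block_size:]            # crop to context window
        logits = model(idx_cond)[:, -1, :] / temperature # logits for last position
        probs = F.softmax(logits, dim=-1)
        next_id = torch.multinomial(probs, num_samples=1)
        idx = torch.cat([idx, next_id], dim=1)           # append and continue
    return idx

model = TinyGPT(vocab_size=len(chars))
start = torch.tensor([encode("h")], dtype=torch.long)
print(decode(generate(model, start, max_new_tokens=20)[0].tolist()))
```

Character-level tokenization keeps the vocabulary tiny (one id per character), which avoids BPE entirely and makes attention scaling and decoding behavior easy to study on small datasets.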

Reinforced understanding of GPT-style architectures and provided a small-scale testbed for experimenting with model scaling behavior and inductive biases.