Building The Future With Technology
I believe technology should serve humanity. My goal is to build products that make a real difference and bring something good into the world.
Ready to Play?
Jump over the obstacles!
Press Space or Click to Start
Take a little break while you're here
Impact & Growth
From concept to deployment
Across multiple platforms
In software development
Quality-driven results
An open-source transformer language model for the Burmese language, built from scratch with PyTorch in 2023 — back when ChatGPT was still on GPT-3.5 and building your own LLM was far from mainstream. One of the earliest efforts to bring large language model research to a low-resource Southeast Asian language, complete with a custom-curated Burmese Wikipedia dataset, a training pipeline, and a live demo on HuggingFace.
A GPT-style Japanese language model trained on anime and manga dialogues, built from scratch with PyTorch. Implements self-attention with Flash Attention support, top-k sampling, and model configurations scaling from 100M to 1B parameters. My very first deep learning project — inspired by Andrej Karpathy's nanoGPT at a time when building your own LLM from scratch was still a novel idea. This project laid the foundation for Burmese-GPT.
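Top-k sampling, mentioned above, restricts generation to the k most probable next tokens before drawing one at random. A minimal PyTorch sketch of the technique — the function name and defaults are illustrative, not taken from the project:

```python
import torch

def sample_top_k(logits: torch.Tensor, k: int = 50, temperature: float = 1.0) -> int:
    """Sample one token id from the k most likely entries of a 1-D logits vector."""
    logits = logits / temperature
    top_vals, top_idx = torch.topk(logits, k)         # keep the k largest logits
    probs = torch.softmax(top_vals, dim=-1)           # renormalize over the k survivors
    choice = torch.multinomial(probs, num_samples=1)  # draw one of the k candidates
    return top_idx[choice].item()
```

With k=1 this degenerates to greedy decoding; larger k trades determinism for diversity.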
A full-stack AI-powered SaaS platform built for real estate professionals. Generates property descriptions, marketing content, and client communications using OpenAI, redraws floor plans and virtually stages vacant rooms through Replicate's image models, and includes an AI chat assistant, market analysis dashboard, and lead generation tools — all behind Stripe-powered subscription tiers. My first startup project, built in 2023 while learning how LLMs worked from the inside out.
A diffusion-based image generator implementing Denoising Diffusion Probabilistic Models (DDPM) from scratch. Uses a UNet2D architecture with attention blocks, trains on CIFAR-10 with HuggingFace Accelerate for distributed training, and implements the full DDPM noise scheduling pipeline. Built to understand how diffusion models generate images from pure noise — the same fundamental technique behind Stable Diffusion and DALL-E.
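The DDPM noise scheduling pipeline described above fits in a few lines: a linear beta schedule, the cumulative alpha products, and the closed-form forward step q(x_t | x_0). This is a generic sketch of the technique, not code from the project:

```python
import torch

def make_ddpm_schedule(timesteps: int = 1000,
                       beta_start: float = 1e-4,
                       beta_end: float = 0.02):
    """Linear beta schedule and cumulative alpha products (alpha-bar) used by DDPM."""
    betas = torch.linspace(beta_start, beta_end, timesteps)
    alphas = 1.0 - betas
    alpha_bars = torch.cumprod(alphas, dim=0)  # alpha_bar_t = prod of alphas up to t
    return betas, alpha_bars

def add_noise(x0: torch.Tensor, t: int, alpha_bars: torch.Tensor) -> torch.Tensor:
    """Forward process: sample x_t directly from x_0 in one step."""
    noise = torch.randn_like(x0)
    a_bar = alpha_bars[t]
    return a_bar.sqrt() * x0 + (1 - a_bar).sqrt() * noise
```

The model is then trained to predict the added noise; sampling runs the process in reverse, denoising step by step from pure Gaussian noise.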
A PyTorch-style deep learning framework written from scratch in Ruby. Implements tensors with gradient tracking, backpropagation, neural network layers (Linear, Conv2D, ReLU, Sigmoid, Tanh), an SGD optimizer, and loss functions (MSE, CrossEntropy). Built to deeply understand how frameworks like PyTorch work under the hood — by reimplementing one in a language with no existing deep learning ecosystem.
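The core mechanism behind such a framework — tensors that record their parents and replay the chain rule in reverse — can be sketched with a scalar autograd node. The project itself is written in Ruby; this Python sketch only illustrates the idea, and all names are illustrative:

```python
class Tensor:
    """A scalar autograd node: tracks parents and accumulates gradients."""
    def __init__(self, data, _parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = _parents
        self._backward = lambda: None

    def __mul__(self, other):
        out = Tensor(self.data * other.data, (self, other))
        def _backward():  # d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def __add__(self, other):
        out = Tensor(self.data + other.data, (self, other))
        def _backward():  # addition passes the gradient through unchanged
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def backward(self):
        # topologically sort the graph, then apply the chain rule from the output back
        topo, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    build(p)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for node in reversed(topo):
            node._backward()
```

For z = x*y + x, calling z.backward() yields dz/dx = y + 1 and dz/dy = x — the same bookkeeping PyTorch performs on full tensors.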
A lightweight CLI tool that analyzes git diffs and uses LLMs to suggest meaningful commit messages or answer questions about code changes. Built with Go and Cobra, with a pluggable AI provider system supporting Groq and OpenAI.
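The tool itself is written in Go; as a rough illustration of the flow — read the staged diff, wrap it in a prompt for an LLM — here is a hypothetical Python sketch. The function names are made up for this example, and the provider call is deliberately omitted:

```python
import subprocess

def staged_diff() -> str:
    """Read the staged changes, the same input a commit-message tool analyzes."""
    result = subprocess.run(
        ["git", "diff", "--cached"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

def build_prompt(diff: str) -> str:
    """Wrap a diff in an instruction asking the LLM for a commit message."""
    return (
        "Suggest a concise, conventional commit message for this diff:\n\n"
        + diff
    )
```

A pluggable provider layer, as in the original project, would then send the prompt to whichever backend (Groq, OpenAI, …) the user has configured.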
Let's discuss your project and explore how I can bring your vision to life with cutting-edge technology and exceptional design.