# Quick Start Guide

**From Zero to Building Neural Networks**: complete setup plus your first module in 15 minutes.

**Purpose:** Get hands-on experience building ML systems in 15 minutes. Verify your setup, then build your first neural network component from scratch.
## 2-Minute Setup
Let’s get you ready to build ML systems:
### Step 1: One-Command Setup

```bash
# Clone the repository
git clone https://github.com/mlsysbook/TinyTorch.git
cd TinyTorch

# Automated setup (handles everything!)
./setup-environment.sh

# Activate the environment
source activate.sh
```
**What this does:**

- Creates an optimized virtual environment (arm64 on Apple Silicon)
- Installs all dependencies (NumPy, Jupyter, Rich, PyTorch for validation)
- Configures TinyTorch in development mode
- Verifies the installation
See Essential Commands for detailed workflow and troubleshooting.
### Step 2: Verify Setup

```bash
# Run system diagnostics
tito system doctor
```
You should see all green checkmarks. This confirms your environment is ready for hands-on ML systems building.
See Essential Commands for verification commands and troubleshooting.
## 15-Minute First Module Walkthrough
Let’s build your first neural network component following the TinyTorch workflow:
```mermaid
graph TD
    Start[Clone & Setup] --> Edit[Edit Module<br/>tensor_dev.ipynb]
    Edit --> Export[Export to Package<br/>tito module complete 01]
    Export --> Test[Test Import<br/>from tinytorch import Tensor]
    Test --> Next[Continue to Module 02]
    style Start fill:#e3f2fd
    style Edit fill:#fffbeb
    style Export fill:#f0fdf4
    style Test fill:#fef3c7
    style Next fill:#f3e5f5
```
See Student Workflow for the complete development cycle.
### Module 01: Tensor Foundations
- **Learning Goal:** Build N-dimensional arrays, the foundation of all neural networks
- **Time:** 15 minutes
- **Action:** Start with Module 01 to build tensor operations from scratch
```bash
# Step 1: Edit the module source
cd modules/01_tensor
jupyter lab tensor_dev.ipynb
```
You’ll implement core tensor operations:

- N-dimensional array creation
- Basic mathematical operations (add, multiply, matmul)
- Shape manipulation (reshape, transpose)
- Memory layout understanding
**Key Implementation:** Build the `Tensor` class that forms the foundation of all neural networks.
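To give you a feel for where the notebook is headed, here is a minimal sketch of such a `Tensor` class built on NumPy. The method names and API here are illustrative assumptions, not the actual TinyTorch interface; the notebook guides you through the real implementation.

```python
import numpy as np

class Tensor:
    """Minimal illustrative sketch of an N-dimensional tensor wrapping a
    NumPy array. Not the real TinyTorch Tensor -- names are assumptions."""

    def __init__(self, data):
        # Store data as a float32 NumPy array (a common ML default)
        self.data = np.asarray(data, dtype=np.float32)

    @property
    def shape(self):
        return self.data.shape

    def __add__(self, other):
        # Element-wise addition
        return Tensor(self.data + other.data)

    def __mul__(self, other):
        # Element-wise multiplication
        return Tensor(self.data * other.data)

    def matmul(self, other):
        # Matrix multiplication via NumPy's @ operator
        return Tensor(self.data @ other.data)

    def reshape(self, *shape):
        return Tensor(self.data.reshape(*shape))

    def transpose(self):
        return Tensor(self.data.T)

a = Tensor([[1.0, 2.0], [3.0, 4.0]])
b = Tensor([[5.0, 6.0], [7.0, 8.0]])
print((a + b).shape)        # (2, 2)
print(a.matmul(b).data)     # [[19. 22.]
                            #  [43. 50.]]
```

Note how every operation returns a new `Tensor` rather than mutating in place; that convention keeps later modules (like autograd) simpler to reason about.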
```bash
# Step 2: Export to the package when ready
tito module complete 01
```
This makes your implementation importable: `from tinytorch import Tensor`
See Student Workflow for the complete edit → export → validate cycle.
Achievement Unlocked: Foundation capability - “Can I create and manipulate the building blocks of ML?”
### Next Step: Module 02 - Activations
- **Learning Goal:** Add nonlinearity, the key to neural network intelligence
- **Time:** 10 minutes
- **Action:** Continue with Module 02 to add activation functions
```bash
# Step 1: Edit the module
cd modules/02_activations
jupyter lab activations_dev.ipynb
```
You’ll implement essential activation functions:

- ReLU (Rectified Linear Unit): the workhorse of deep learning
- Softmax: for probability distributions
- Understand gradient flow and numerical stability
- Learn why nonlinearity enables learning
**Key Implementation:** Build activation functions that allow neural networks to learn complex patterns.
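As a preview, here is a hedged NumPy sketch of the two functions named above. The softmax subtracts the row maximum before exponentiating, which is the standard trick behind the "numerical stability" bullet: it prevents `exp` from overflowing on large logits. The notebook walks you through its own version, so treat these signatures as illustrative.

```python
import numpy as np

def relu(x):
    """ReLU: zero out negatives, pass positives through unchanged."""
    return np.maximum(0.0, x)

def softmax(x):
    """Numerically stable softmax: subtracting the max before exp
    avoids overflow without changing the resulting probabilities."""
    shifted = x - np.max(x, axis=-1, keepdims=True)
    exps = np.exp(shifted)
    return exps / np.sum(exps, axis=-1, keepdims=True)

logits = np.array([2.0, 1.0, 0.1])
print(relu(np.array([-1.0, 0.0, 3.0])))          # [0. 0. 3.]
probs = softmax(logits)
print(np.isclose(probs.sum(), 1.0))              # True
print(np.isfinite(softmax(np.array([1000.0, 999.0]))).all())  # True (no overflow)
```

Without the max subtraction, `np.exp(1000.0)` overflows to infinity; with it, the same inputs produce valid probabilities.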
```bash
# Step 2: Export when ready
tito module complete 02
```
See Student Workflow for the complete edit → export → validate cycle.
Achievement Unlocked: Intelligence capability - “Can I add nonlinearity to enable learning?”
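To see why the nonlinearity bullet above matters, here is a short NumPy demonstration (illustrative, not part of the TinyTorch codebase): two linear layers with no activation between them collapse into a single linear layer, so depth adds nothing until you insert something like ReLU.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 8))   # weights of a first "layer"
W2 = rng.normal(size=(8, 3))   # weights of a second "layer"
x = rng.normal(size=(5, 4))    # a batch of 5 inputs

# Two stacked linear layers with no activation in between...
two_layers = (x @ W1) @ W2
# ...compute exactly the same function as ONE linear layer with weights W1 @ W2:
one_layer = x @ (W1 @ W2)
print(np.allclose(two_layers, one_layer))   # True

# Insert a ReLU between the layers and the equivalence breaks --
# the network can now represent functions no single linear layer can.
with_relu = np.maximum(0.0, x @ W1) @ W2
print(np.allclose(with_relu, one_layer))    # False
```

This is precisely the limitation behind the 1969 XOR crisis mentioned in the milestones below: linear models, however deep, cannot separate XOR.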
## Track Your Progress
After completing your first modules:
**Check your new capabilities:** use the optional checkpoint system to track your progress:

```bash
tito checkpoint status   # View your completion tracking
```

This is helpful for self-assessment but not required for the core workflow.
See Student Workflow for the essential edit → export → validate cycle.
## Validate with Historical Milestones
After exporting your modules, prove what you’ve built by running milestone scripts:
- **After Module 07:** Build Rosenblatt’s 1957 Perceptron, the first trainable neural network
- **After Module 07:** Solve the 1969 XOR Crisis with multi-layer networks
- **After Module 08:** Achieve 95%+ accuracy on MNIST with 1986 backpropagation
- **After Module 09:** Hit 75%+ on CIFAR-10 with 1998 CNNs
- **After Module 13:** Generate text with 2017 Transformers
- **After Module 18:** Optimize for production with the 2018 Torch Olympics
See Journey Through ML History for complete timeline, requirements, and expected results.
**The Workflow:** Edit modules → Export with `tito module complete N` → Run milestone scripts to validate
See Student Workflow for the complete cycle.
## What You Just Accomplished

In 15 minutes, you’ve:

- **Setup Complete:** installed TinyTorch and verified your environment
- **Created Foundation:** implemented core tensor operations from scratch
- **First Capability:** earned your first ML systems capability checkpoint
## Your Next Steps

### Immediate Next Actions (Choose One)
**Continue Building (Recommended):** Begin Module 03 to add layers to your network.

**Master the Workflow:**

- See Student Workflow for the complete edit → export → validate cycle
- See Essential Commands for the complete TITO command reference

**For Instructors:**

- See the Classroom Setup Guide for NBGrader integration (coming soon)
## Pro Tips for Continued Success

**The TinyTorch Development Cycle:**

1. **Edit** module sources in `modules/NN_name/` (e.g., `modules/01_tensor/tensor_dev.ipynb`)
2. **Export** with `tito module complete N`
3. **Validate** by running milestone scripts

See Student Workflow for the detailed workflow guide and best practices.
## You’re Now a TinyTorch Builder

**Ready to Build Production ML Systems**

You’ve proven you can build ML components from scratch. Time to keep going!

**What makes TinyTorch different:** You’re not just learning about neural networks; you’re building them from fundamental mathematical operations. Every line of code you write builds toward complete ML systems mastery.

**Next milestone:** After Module 08, you’ll train real neural networks on actual datasets using 100% your own code!