# Module: Layers
⭐⭐ | ⏱️ 4-5 hours
## 📊 Module Info

- **Difficulty**: ⭐⭐ Intermediate
- **Time Estimate**: 4-5 hours
- **Prerequisites**: Tensor, Activations modules
- **Next Steps**: Networks module
Build the fundamental transformations that compose into neural networks. This module teaches you that layers are simply functions that transform tensors, and neural networks are just sophisticated function composition using these building blocks.
## 🎯 Learning Objectives

By the end of this module, you will be able to:

- **Understand layers as mathematical functions**: Recognize that layers transform tensors through well-defined mathematical operations
- **Implement Dense layers**: Build linear transformations using matrix multiplication and bias addition (`y = Wx + b`)
- **Integrate activation functions**: Combine linear layers with nonlinear activations to enable complex pattern learning
- **Compose simple building blocks**: Chain layers together to create complete neural network architectures
- **Debug layer implementations**: Use shape analysis and mathematical properties to verify correct implementation
## 🔧 Build → Use → Reflect

This module follows TinyTorch's Build → Use → Reflect framework:

- **Build**: Implement Dense layers and activation functions from mathematical foundations
- **Use**: Transform tensors through layer operations and see immediate results in various scenarios
- **Reflect**: Understand how simple layers compose into complex neural networks and why architecture matters
## 🏗️ What You'll Build

### Core Layer Implementation
```python
# Dense layer: fundamental building block
layer = Dense(input_size=3, output_size=2)
x = Tensor([[1.0, 2.0, 3.0]])
y = layer(x)  # Shape transformation: (1, 3) → (1, 2)

# With activation functions
relu = ReLU()
activated = relu(y)  # Apply nonlinearity

# Chaining operations
layer1 = Dense(784, 128)  # Image → hidden
layer2 = Dense(128, 10)   # Hidden → classes
activation = ReLU()

# Forward pass composition
x = Tensor([[1.0, 2.0, 3.0, ...]])  # Input data
h1 = activation(layer1(x))          # First transformation
output = layer2(h1)                 # Final prediction
```
### Dense Layer Implementation

- **Mathematical foundation**: Linear transformation `y = Wx + b`
- **Weight initialization**: Xavier/Glorot uniform initialization for stable gradients
- **Bias handling**: Optional bias terms that shift outputs, making the layer affine rather than purely linear
- **Shape management**: Automatic handling of batch dimensions and matrix operations
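To make the bullets above concrete, here is a minimal NumPy sketch of a Dense layer with Xavier/Glorot uniform initialization. The class name and method layout are illustrative assumptions; TinyTorch's actual implementation works on `Tensor` objects and may differ in detail.

```python
import numpy as np

class Dense:
    """Sketch of a Dense layer computing y = x @ W + b."""

    def __init__(self, input_size, output_size, use_bias=True):
        # Xavier/Glorot uniform: bound = sqrt(6 / (fan_in + fan_out))
        bound = np.sqrt(6.0 / (input_size + output_size))
        self.weights = np.random.uniform(-bound, bound, (input_size, output_size))
        self.bias = np.zeros(output_size) if use_bias else None

    def forward(self, x):
        y = x @ self.weights      # (batch, in) @ (in, out) -> (batch, out)
        if self.bias is not None:
            y = y + self.bias     # bias broadcasts across the batch dimension
        return y

    def __call__(self, x):
        return self.forward(x)

layer = Dense(input_size=3, output_size=2)
x = np.array([[1.0, 2.0, 3.0]])
y = layer(x)
print(y.shape)  # (1, 2)
```

Note how the weight matrix is stored as `(input_size, output_size)` so that batched inputs of shape `(batch, input_size)` multiply directly without transposes.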
### Activation Layer Integration

- **ReLU integration**: The most common activation for hidden layers
- **Sigmoid integration**: Probability outputs for binary classification
- **Tanh integration**: Zero-centered outputs for better optimization
- **Composition patterns**: Standard ways to combine layers and activations
## 🚀 Getting Started

### Prerequisites

Ensure you have completed the foundational modules:

```bash
# Activate TinyTorch environment
source bin/activate-tinytorch.sh

# Verify prerequisite modules
tito test --module tensor
tito test --module activations
```
### Development Workflow

1. **Open the development file**: `modules/source/04_layers/layers_dev.py`
2. **Implement the Dense layer class**: Start with the `__init__` and `forward` methods
3. **Test layer functionality**: Use inline tests for immediate feedback
4. **Add activation integration**: Combine layers with activation functions
5. **Build complete networks**: Chain multiple layers together
6. **Export and verify**: `tito export --module layers && tito test --module layers`
## 🧪 Testing Your Implementation

### Comprehensive Test Suite

Run the full test suite to verify mathematical correctness:

```bash
# TinyTorch CLI (recommended)
tito test --module layers

# Direct pytest execution
python -m pytest tests/ -k layers -v
```
### Test Coverage Areas

- ✅ **Layer Functionality**: Verify Dense layers perform correct linear transformations
- ✅ **Weight Initialization**: Ensure proper weight initialization for training stability
- ✅ **Shape Preservation**: Confirm layers handle batch dimensions correctly
- ✅ **Activation Integration**: Test seamless combination with activation functions
- ✅ **Network Composition**: Verify layers can be chained into complete networks
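As a sketch of what such property-based checks can look like, the snippet below verifies the shape and affine-map behavior of a minimal stand-in layer. The helper `make_dense` is a hypothetical illustration, not part of TinyTorch's test suite.

```python
import numpy as np

def make_dense(in_size, out_size, seed=0):
    """Minimal stand-in for a Dense layer: returns f(x) = x @ W + b."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((in_size, out_size))
    b = rng.standard_normal(out_size)
    return (lambda x: x @ W + b), W, b

def test_dense_properties():
    f, W, b = make_dense(4, 3)
    x = np.ones((2, 4))  # batch of 2 samples
    y = f(x)
    # Shape check: (batch, in) -> (batch, out)
    assert y.shape == (2, 3)
    # Affine check: f(x) - f(0) should equal x @ W (the bias cancels)
    zero = np.zeros((2, 4))
    assert np.allclose(y - f(zero), x @ W)

test_dense_properties()
print("all checks passed")
```

Testing mathematical properties (shapes, linearity, bias cancellation) rather than hard-coded outputs makes the tests robust to random initialization.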
### Inline Testing & Development

The module includes educational feedback during development:

```
# Example inline test output
🔬 Unit Test: Dense layer functionality...
✅ Dense layer computes y = Wx + b correctly
✅ Weight initialization within expected range
✅ Output shape matches expected dimensions
📈 Progress: Dense Layer ✓

# Integration testing
🔬 Unit Test: Layer composition...
✅ Multiple layers chain correctly
✅ Activations integrate seamlessly
📈 Progress: Layer Composition ✓
```
### Manual Testing Examples

```python
from tinytorch.core.tensor import Tensor
from layers_dev import Dense
from activations_dev import ReLU

# Test basic layer functionality
layer = Dense(input_size=3, output_size=2)
x = Tensor([[1.0, 2.0, 3.0]])
y = layer(x)
print(f"Input shape: {x.shape}, Output shape: {y.shape}")

# Test layer composition
layer1 = Dense(3, 4)
layer2 = Dense(4, 2)
relu = ReLU()

# Forward pass
h1 = relu(layer1(x))
output = layer2(h1)
print(f"Final output: {output.data}")
```
## 🎯 Key Concepts

### Real-World Applications

- **Computer Vision**: Dense layers process flattened image features in CNNs (e.g., the final layers of VGG and ResNet)
- **Natural Language Processing**: Dense layers transform word embeddings in transformers and RNNs
- **Recommendation Systems**: Dense layers combine user and item features for preference prediction
- **Scientific Computing**: Dense layers approximate complex functions in physics simulations and engineering
### Mathematical Foundations

- **Linear transformation**: `y = Wx + b`, where `W` is the weight matrix and `b` is the bias vector
- **Matrix multiplication**: Efficient batch processing through vectorized operations
- **Weight initialization**: Xavier/Glorot initialization prevents vanishing/exploding gradients
- **Function composition**: Networks as nested function calls: `f3(f2(f1(x)))`
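The composition idea can be made concrete with plain NumPy: each "layer" is just a function, and the network is their nesting. The shapes below mirror the 784 → 128 → 10 pipeline used earlier (the 64-unit middle layer is an illustrative addition).

```python
import numpy as np

rng = np.random.default_rng(42)

# Three "layers" as plain functions over weight matrices
W1 = rng.standard_normal((784, 128))
W2 = rng.standard_normal((128, 64))
W3 = rng.standard_normal((64, 10))

f1 = lambda x: np.maximum(x @ W1, 0)  # linear map + ReLU nonlinearity
f2 = lambda x: np.maximum(x @ W2, 0)
f3 = lambda x: x @ W3                 # final linear map (class scores)

x = rng.standard_normal((1, 784))
y = f3(f2(f1(x)))                     # the network IS the composition
print(y.shape)  # (1, 10)
```

Without the `np.maximum` nonlinearities, the composition would collapse to a single matrix product `x @ (W1 @ W2 @ W3)`, which is why activations between layers matter.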
### Neural Network Building Blocks

- **Modularity**: Layers as reusable components that can be combined in different ways
- **Standardized interface**: All layers follow the same input/output pattern for easy composition
- **Shape consistency**: Automatic handling of batch dimensions and shape transformations
- **Nonlinearity**: Activation functions between layers enable learning of complex patterns
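One common way these ideas combine is a container that chains components sharing the same call interface. The `Sequential` class below is a hypothetical sketch of that pattern, not TinyTorch's actual API.

```python
import numpy as np

class Sequential:
    """Chain callables that each map a batch of inputs to a batch of outputs."""

    def __init__(self, *steps):
        self.steps = steps

    def __call__(self, x):
        for step in self.steps:  # feed each output into the next step
            x = step(x)
        return x

# Any callable with the standard interface composes, including plain functions.
relu = lambda x: np.maximum(x, 0)
linear = lambda W: (lambda x: x @ W)

model = Sequential(linear(np.ones((3, 4))), relu, linear(np.ones((4, 2))))
out = model(np.array([[1.0, -2.0, 3.0]]))
print(out.shape)  # (1, 2)
```

Because every step obeys the same input/output contract, swapping a layer or inserting an activation is a one-line change, which is exactly the modularity the bullets above describe.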
### Implementation Patterns

- **Class-based design**: Layers as objects with state (weights) and behavior (forward pass)
- **Initialization strategy**: Proper weight initialization for stable training dynamics
- **Error handling**: Graceful handling of shape mismatches and invalid inputs
- **Testing philosophy**: Comprehensive testing of mathematical properties and edge cases
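Shape mismatches are far easier to debug when a layer validates its input up front. The `SafeDense` sketch below illustrates one such fail-fast pattern; the class name and error message are assumptions for illustration.

```python
import numpy as np

class SafeDense:
    """Dense layer sketch that fails fast on shape mismatches."""

    def __init__(self, input_size, output_size):
        self.input_size = input_size
        self.weights = np.zeros((input_size, output_size))
        self.bias = np.zeros(output_size)

    def __call__(self, x):
        # Validate before multiplying so the error names the layer's expectation
        if x.ndim != 2 or x.shape[1] != self.input_size:
            raise ValueError(
                f"Dense expected input of shape (batch, {self.input_size}), "
                f"got {x.shape}"
            )
        return x @ self.weights + self.bias

layer = SafeDense(3, 2)
try:
    layer(np.ones((1, 5)))  # wrong feature dimension
except ValueError as e:
    print(e)
```

An explicit check turns an opaque matrix-multiplication error deep in NumPy into a message that names the layer's expected shape, which is what "graceful handling" means in practice.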
## 🚀 Ready to Build?

You're about to build the fundamental building blocks that power every neural network! Dense layers might seem simple, but they're the workhorses of deep learning, from the final layers of image classifiers to the core components of language models.

Understanding how these simple linear transformations compose into complex intelligence is one of the most beautiful insights in machine learning. Take your time, understand the mathematics, and enjoy building the foundation of artificial intelligence!
Choose your preferred way to engage with this module:

- **Binder**: Run this module interactively in your browser. No installation required!
- **Google Colab**: Use Google Colab for GPU access and cloud compute power.
- **Source**: Browse the Python source code and understand the implementation.
💾 **Save Your Progress**

Binder sessions are temporary! Download your completed notebook when done, or switch to local development for persistent work.

Ready for serious development? → 🏗️ Local Setup Guide