Finally understand how PyTorch really works
Build your own deep learning framework from scratch
```python
# What's actually happening here?
loss.backward()   # Magic?
optimizer.step()  # More magic?
```
You're not alone. Most ML students and engineers use deep learning frameworks without understanding the internals. That's where TensorWeaver comes in.
TensorWeaver is the educational deep learning framework that shows you exactly how PyTorch works under the hood. Built from scratch in pure Python, it demystifies automatic differentiation, neural networks, and optimization algorithms.
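The short answer: `loss.backward()` uses automatic differentiation to compute the gradient of the loss with respect to every parameter, and `optimizer.step()` nudges each parameter a small step against its gradient. Here is that idea stripped down to plain Python with a single parameter; this is an illustrative sketch of the standard recipe, not TensorWeaver or PyTorch source code:

```python
# One parameter, one data point, plain Python -- no framework involved.
# This is what backward() and step() boil down to conceptually.

w = 0.0               # the model's only parameter
lr = 0.1              # learning rate
x, y_true = 2.0, 6.0  # a single training example (true relation: y = 3 * x)

for step in range(20):
    y_pred = w * x                       # forward pass
    loss = (y_pred - y_true) ** 2        # squared-error loss
    grad_w = 2 * (y_pred - y_true) * x   # what loss.backward() computes: d(loss)/dw
    w -= lr * grad_w                     # what optimizer.step() does for plain SGD

print(w)  # converges toward 3.0
```

TensorWeaver's job is to show you how this one-parameter picture generalizes to tensors, computation graphs, and millions of parameters.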
Think of it as "PyTorch with the hood open". It's built for:
- CS Students learning machine learning internals
- Self-taught developers who want to go beyond tutorials
- ML Engineers debugging complex gradient issues
- Educators teaching deep learning concepts
- Curious minds who refuse to accept "magic"
Pro Tip: Use `import tensorweaver as torch` for seamless PyTorch compatibility!
```bash
pip install tensorweaver
```

```python
import tensorweaver as torch  # PyTorch-compatible API!

# Build a neural network (just like PyTorch!)
class SimpleModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.linear1 = torch.nn.Linear(784, 128)
        self.relu = torch.nn.ReLU()
        self.linear2 = torch.nn.Linear(128, 10)

    def forward(self, x):
        x = self.relu(self.linear1(x))
        return self.linear2(x)

model = SimpleModel()

# Train it
loss_fn = torch.nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# The difference? You can see EXACTLY what happens inside!
```
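To finish the quick start, a training step follows the familiar forward / loss / backward / update cycle. The loop below continues from the snippet above and is a hedged sketch: it assumes the PyTorch-style calls (`model(x)`, `optimizer.zero_grad()`, `loss.backward()`, `optimizer.step()`) carry over, and it feeds a dummy all-ones batch in place of real data:

```python
# Continuing from the snippet above -- an illustrative sketch assuming
# the PyTorch-style training API carries over.
x = torch.tensor([[1.0] * 784])  # dummy input: a batch of one "image"
y = torch.tensor([[0.0] * 10])   # dummy target

for epoch in range(5):
    pred = model(x)              # forward pass through linear1 -> relu -> linear2
    loss = loss_fn(pred, y)      # mean squared error against the target
    optimizer.zero_grad()        # clear gradients from the previous step
    loss.backward()              # autodiff fills every parameter's .grad
    optimizer.step()             # SGD update: p -= lr * p.grad
    print(loss.data)             # should shrink as the model fits the dummy data
```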
Try it live in your browser →
| Industrial Frameworks | TensorWeaver |
|---|---|
| ❌ Complex C++ codebase | ✅ Pure Python - readable by humans |
| ❌ Optimized for performance | ✅ Optimized for learning |
| ❌ "Trust us, it works" | ✅ "Here's exactly how it works" |
| ❌ Intimidating for beginners | ✅ Designed for education |
- Transparent Implementation: Every operation is visible and understandable
- Pure Python: No hidden C++ complexity - just NumPy and Python
- PyTorch-Compatible API: Same interface, easier transition
- Educational Focus: Built for learning, not just using
- Complete Functionality: Autodiff, neural networks, optimizers, ONNX export
- Growing Documentation: Clear explanations with working examples
- Tensor Basics - Understanding tensors and operations
- Linear Regression - Your first neural network
- Automatic Differentiation - How gradients are computed (coming soon)
- Multi-layer Networks - Building deeper models
- Loss Functions & Optimizers - Training dynamics (coming soon)
- Model Export - ONNX export and deployment
- Custom Operators - Extending the framework (coming soon)
- Performance Optimization - Making it faster (coming soon)
- GPU Support - Scaling computations (in development)
Note: Some documentation links are still in development. Check our milestones for working examples!
See Automatic Differentiation in Action
```python
import tensorweaver as torch

# Create tensors
x = torch.tensor([2.0])
y = torch.tensor([3.0])

# Forward pass
z = x * y + x**2
print(f"z = {z.data}")  # [10.0]

# Backward pass - see the magic!
z.backward()
print(f"dz/dx = {x.grad}")  # [7.0] = y + 2*x = 3 + 4
print(f"dz/dy = {y.grad}")  # [2.0] = x
```
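So what does `z.backward()` actually do? Conceptually, every operation records its inputs along with a local-derivative rule, and `backward()` walks that record in reverse, multiplying and accumulating gradients via the chain rule. The class below is deliberately simplified teaching code, not TensorWeaver's implementation, but it reproduces the numbers above:

```python
# A tiny, framework-agnostic reverse-mode autodiff sketch (teaching code only).

class Scalar:
    def __init__(self, value, parents=()):
        self.value = value
        self.grad = 0.0
        self._parents = parents  # pairs of (input node, local derivative)

    def __mul__(self, other):
        # d(a*b)/da = b, d(a*b)/db = a
        return Scalar(self.value * other.value,
                      parents=((self, other.value), (other, self.value)))

    def __add__(self, other):
        # d(a+b)/da = 1, d(a+b)/db = 1
        return Scalar(self.value + other.value,
                      parents=((self, 1.0), (other, 1.0)))

    def backward(self, upstream=1.0):
        self.grad += upstream
        for parent, local in self._parents:
            parent.backward(upstream * local)  # chain rule

x, y = Scalar(2.0), Scalar(3.0)
z = x * y + x * x        # same expression as above
z.backward()
print(x.grad, y.grad)    # 7.0 2.0
```

A real engine does the same bookkeeping over whole tensors and walks the graph in topological order rather than by naive recursion, but the chain-rule idea is identical.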
Build a Neural Network from Scratch
```python
import tensorweaver as torch

class MLP(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = torch.nn.Linear(784, 128)
        self.relu = torch.nn.ReLU()
        self.fc2 = torch.nn.Linear(128, 10)

    def forward(self, x):
        x = self.relu(self.fc1(x))
        return self.fc2(x)

# Every operation is transparent!
model = MLP()
print(model)  # See the architecture
```
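"Transparent" here means a layer like `torch.nn.Linear` is just readable Python over NumPy arrays. As a flavor of that (an illustrative sketch, not TensorWeaver's actual source), a minimal linear layer is little more than a matrix multiply plus a bias:

```python
import numpy as np

class TinyLinear:
    """A stripped-down linear layer for illustration only."""

    def __init__(self, in_features, out_features):
        # small random weights and zero biases, like a textbook layer
        self.weight = np.random.randn(in_features, out_features) * 0.01
        self.bias = np.zeros(out_features)

    def forward(self, x):
        # y = x @ W + b : a matrix multiply plus a bias, nothing hidden
        return x @ self.weight + self.bias

layer = TinyLinear(784, 128)
print(layer.forward(np.ones((1, 784))).shape)  # (1, 128)
```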
Instead of mysterious "black box" operations, TensorWeaver shows you:
- Transparent code - Every function is readable Python
- Step-by-step execution - See exactly how gradients flow
- PyTorch compatibility - Easy transition to production frameworks
- Educational focus - Built for understanding, not just using
Real testimonials coming as the community grows!
```bash
# Option 1: Install from PyPI (recommended)
pip install tensorweaver

# Option 2: Install from source (for contributors)
git clone https://github.com/howl-anderson/tensorweaver.git
cd tensorweaver
poetry install
```
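To check that the install worked, you can rerun the autodiff example from earlier as a quick smoke test (the exact printed formatting may differ):

```python
import tensorweaver as torch

x = torch.tensor([2.0])
y = torch.tensor([3.0])
z = x * y + x ** 2
z.backward()
print(z.data, x.grad, y.grad)  # expect values of 10, 7 and 2
```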
- Try Examples - Hands-on Jupyter notebooks
- Interactive Playground - No setup required
- Join Community - Ask questions and share projects
- Read Documentation - Framework overview (expanding soon)
TensorWeaver thrives on community contributions! Whether you're:
- Reporting bugs
- Suggesting features
- Improving documentation
- Adding examples
- Writing code
We welcome you! Please open an issue or submit a pull request - contribution guidelines coming soon!
- Documentation - Framework overview
- Discussions - Community Q&A
- Issues - Bug reports and feature requests
- Follow Updates - Star/watch for latest changes
Using TensorWeaver in your course? We'd love to help!
- Working Examples - Ready-to-use Jupyter notebooks
- Get Support - Ask questions and get help
- Contact Us - Let us know about your educational use case
Curriculum materials and instructor resources are in development - please reach out if you're interested!
If TensorWeaver helped you understand deep learning better, please consider starring the repository! It helps others discover this educational resource.
TensorWeaver is MIT licensed. See LICENSE for details.
- Inspired by educational frameworks: Micrograd, TinyFlow, and DeZero
- Thanks to the PyTorch team for the API design
- Grateful to all contributors and the open-source community
Ready to peek behind the curtain?
Start Learning at tensorweaver.ai