
Nexdl

Nexdl (Next Deep Learning) is a minimal, hackable, and educational autograd engine and neural network library for Python. It is designed to be a transparent and easily understandable implementation of modern deep learning concepts, heavily inspired by PyTorch.

Philosophy

The core philosophy of Nexdl is simplicity and transparency.

  • Pure Python/NumPy: The entire core logic is written in high-level Python using NumPy for backend operations. This makes the code easy to read, debug, and modify.
  • PyTorch-like API: If you know PyTorch, you already know Nexdl. We aim to keep the API surface as close to PyTorch as possible for familiar usage.
  • Hackable: Nexdl is built for research and education. Want to implement a custom autograd function? It's just a few lines of Python. Need to see how backpropagation works? Just read the tensor.py file.

Features

  • Automatic Differentiation (Autograd): Full reverse-mode automatic differentiation.
  • Dynamic Computational Graph: Define-by-Run execution.
  • Neural Network Layers: use nexdl.nn for standard layers like Linear, Conv2d (coming soon), RNN, etc.
  • Optimizers: SGD, Adam, AdamW.
  • Extensible: Easily add new operations by subclassing Function.
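All of the listed optimizers share the same basic shape: take the gradients accumulated by autograd and apply an update rule to each parameter. As a rough illustration, here is the vanilla SGD step in plain NumPy (a sketch of the textbook rule, not Nexdl's actual source):

```python
import numpy as np

def sgd_step(params, grads, lr=0.01):
    """Vanilla SGD: p <- p - lr * grad, applied in place."""
    for p, g in zip(params, grads):
        p -= lr * g

# Toy usage: one parameter vector and its gradient.
w = np.array([1.0, 2.0])
g = np.array([0.5, 0.5])
sgd_step([w], [g], lr=0.1)
print(w)  # [0.95 1.95]
```

Adam and AdamW layer per-parameter running moment estimates on top of this same loop, but the in-place update pattern is identical.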

Installation

You can install Nexdl directly from the source.

git clone https://github.com/yourusername/Nexdl.git
cd Nexdl
pip install -e .

Prerequisites:

  • Python 3.7+
  • NumPy

Quick Start

1. Tensors and Autograd

The core of Nexdl is the Tensor object, which tracks operations for automatic differentiation.

import nexdl as nx

# Create tensors
x = nx.tensor([1.0, 2.0, 3.0], requires_grad=True)
y = nx.tensor([4.0, 5.0, 6.0], requires_grad=True)

# Perform operations
z = x * y + x.sum()

# Compute gradients
z.sum().backward()

print(f"x.grad: {x.grad}")
print(f"y.grad: {y.grad}")

2. Building Neural Networks

Nexdl provides a Module class to organize your neural networks, just like PyTorch.

import nexdl as nx
import nexdl.nn as nn
import nexdl.optim as optim

class SimpleNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(10, 20)
        self.relu = nn.ReLU()
        self.fc2 = nn.Linear(20, 1)

    def forward(self, x):
        x = self.fc1(x)
        x = self.relu(x)
        x = self.fc2(x)
        return x

# Initialize model and optimizer
model = SimpleNet()
optimizer = optim.SGD(model.parameters(), lr=0.01)

# Dummy Input
input_data = nx.randn(5, 10)
target = nx.randn(5, 1)

# Training Step
optimizer.zero_grad()
output = model(input_data)
loss = ((output - target) ** 2).mean() # MSE Loss
loss.backward()
optimizer.step()

print(f"Loss: {loss.item()}")

Core Concepts

Tensor

The Tensor class is the main data structure. It wraps a NumPy array and adds autograd capabilities.

  • requires_grad=True: Tracks operations on this tensor.
  • backward(): Computes gradients for all tensors in the computational graph that have requires_grad=True.
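To make the mechanics concrete, here is a heavily simplified scalar autograd sketch in plain Python: each value records its parents and a local backward rule, and backward() walks the graph in reverse topological order, accumulating gradients. Nexdl's tensor.py implements the same idea over NumPy arrays; the class and method internals below are illustrative only.

```python
class Value:
    """Minimal scalar autograd node (illustrative, not Nexdl's Tensor)."""
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._local_backward = None  # propagates out.grad to parents

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def _bw():
            self.grad += other.data * out.grad   # d(xy)/dx = y
            other.grad += self.data * out.grad   # d(xy)/dy = x
        out._local_backward = _bw
        return out

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def _bw():
            self.grad += out.grad                # d(x+y)/dx = 1
            other.grad += out.grad               # d(x+y)/dy = 1
        out._local_backward = _bw
        return out

    def backward(self):
        # Build reverse topological order, then apply each local rule.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            if v._local_backward:
                v._local_backward()

x = Value(2.0)
y = Value(3.0)
z = x * y + x   # dz/dx = y + 1 = 4, dz/dy = x = 2
z.backward()
print(x.grad, y.grad)  # 4.0 2.0
```

Note that gradients are accumulated with +=, which is why values used in multiple places (here, x) receive the sum of their contributions.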

Function

Every operation (add, sub, mul, etc.) is implemented as a subclass of Function.

  • forward(ctx, *args): Computes the output.
  • backward(ctx, grad_output): Computes the gradients for the inputs.
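A new operation following this pattern might look like the sketch below. The ctx object and its save_for_backward helper are assumptions based on the PyTorch-style API the README describes; this is a standalone illustration, not Nexdl's actual code.

```python
import numpy as np

class Ctx:
    """Minimal context object for stashing values needed in backward."""
    def save_for_backward(self, *values):
        self.saved = values

class Exp:
    """Element-wise exp written as a forward/backward pair (illustrative)."""
    @staticmethod
    def forward(ctx, x):
        out = np.exp(x)
        ctx.save_for_backward(out)   # reuse the output: d/dx exp(x) = exp(x)
        return out

    @staticmethod
    def backward(ctx, grad_output):
        (out,) = ctx.saved
        return grad_output * out     # chain rule

# Manual round trip through the two halves.
ctx = Ctx()
y = Exp.forward(ctx, np.array([0.0, 1.0]))
dx = Exp.backward(ctx, np.ones(2))
```

In the full engine, the Tensor machinery calls forward, records the op in the graph, and later feeds the upstream gradient into backward for you; only the two static methods need to be written.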

Module

The nn.Module class is the base class for all neural network modules.

  • Automatically tracks Parameters.
  • Supports state_dict() for saving/loading models.
  • Handles train() and eval() modes.
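Automatic parameter tracking in a Module-style base class typically works by intercepting attribute assignment: anything assigned as a Parameter (or a sub-module) is registered as it is set. A bare-bones sketch of the idea, under that assumption (not Nexdl's actual implementation):

```python
import numpy as np

class Parameter:
    def __init__(self, data):
        self.data = data
        self.grad = None

class Module:
    """Collects Parameters and sub-Modules assigned as attributes."""
    def __init__(self):
        object.__setattr__(self, "_params", {})
        object.__setattr__(self, "_modules", {})

    def __setattr__(self, name, value):
        if isinstance(value, Parameter):
            self._params[name] = value
        elif isinstance(value, Module):
            self._modules[name] = value
        object.__setattr__(self, name, value)

    def parameters(self):
        # Own parameters first, then recurse into sub-modules.
        yield from self._params.values()
        for m in self._modules.values():
            yield from m.parameters()

class Linear(Module):
    def __init__(self, n_in, n_out):
        super().__init__()
        self.w = Parameter(np.zeros((n_in, n_out)))
        self.b = Parameter(np.zeros(n_out))

net = Module()
net.fc = Linear(3, 2)
print(len(list(net.parameters())))  # 2
```

The same recursive walk over _params and _modules is what makes state_dict() and train()/eval() propagation possible: each only needs to visit the tree once.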

Contributing

Nexdl is an open project for learning and experimentation. Pull requests are welcome!

  1. Fork the repository.
  2. Create your feature branch (git checkout -b feature/amazing-feature).
  3. Commit your changes (git commit -m 'Add some amazing feature').
  4. Push to the branch (git push origin feature/amazing-feature).
  5. Open a Pull Request.

License

MIT
