FizTorch: A Toy Tensor Library for Machine Learning


FizTorch is a lightweight deep learning framework designed for educational purposes and small-scale projects. It provides a simple and intuitive API for building and training neural networks, inspired by popular frameworks like PyTorch.

Table of Contents

  • Features
  • Installation
  • Usage
  • Examples
  • Contributing
  • License
  • Contact

Features

  • Tensor Operations: Basic tensor operations with support for automatic differentiation (see the sketch after this list).
  • Neural Network Layers: Common neural network layers such as Linear and ReLU.
  • Sequential Model: Easy-to-use sequential model for stacking layers.
  • Functional API: Functional operations for common neural network functions.
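
A minimal sketch of the autograd workflow is shown below. It assumes, beyond what the examples later in this README demonstrate, that Tensor overloads elementwise * and + and that backward() on a non-scalar result seeds the gradient with ones; treat those as assumptions rather than documented behavior.

from fiztorch.tensor import Tensor

# Tensors that track gradients
a = Tensor([2.0, 3.0], requires_grad=True)
b = Tensor([4.0, 5.0], requires_grad=True)

# Elementwise arithmetic builds the computation graph
# (operator overloads for * and + are assumed, not shown elsewhere in this README)
c = a * b + a

# Backpropagate; seeding gradients for a non-scalar result is assumed to default to ones
c.backward()

print(a.grad)  # expected to equal b + 1 elementwise
print(b.grad)  # expected to equal a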

Installation

To install FizTorch, follow these steps:

  1. Clone the Repository:

    git clone https://github.com/ahammadnafiz/FizTorch.git
    cd FizTorch
  2. Set Up a Virtual Environment (optional but recommended):

    python -m venv fiztorch-env
    source fiztorch-env/bin/activate  # On Windows, use `fiztorch-env\Scripts\activate`
  3. Install Dependencies:

    pip install -r requirements.txt
  4. Install FizTorch (a quick import check follows this list):

    pip install -e .
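
To confirm the editable install succeeded, importing the package from Python should work:

python -c "import fiztorch; print('FizTorch is ready')"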

Usage

Here is a simple example of how to use FizTorch to build and train a neural network:

import numpy as np
from fiztorch.tensor import Tensor
from fiztorch.nn import Linear, ReLU, Sequential
import fiztorch.nn.functional as F

# Define a simple neural network
model = Sequential(
    Linear(2, 3),
    ReLU(),
    Linear(3, 1)
)

# Create some input data
input = Tensor([[1.0, 2.0]], requires_grad=True)

# Forward pass
output = model(input)

# Backward pass
output.backward()

# Print the gradients
print(input.grad)
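
The example above stops at a single forward/backward pass. The sketch below extends it into a plain gradient-descent loop; it is illustrative only, and F.mse_loss, model.parameters(), and the .data/.grad attributes used for the parameter update are assumptions about the API, not calls confirmed by this README.

import numpy as np
from fiztorch.tensor import Tensor
from fiztorch.nn import Linear, ReLU, Sequential
import fiztorch.nn.functional as F

model = Sequential(Linear(2, 3), ReLU(), Linear(3, 1))

# Toy regression data
X = Tensor(np.random.randn(8, 2))
y = Tensor(np.random.randn(8, 1))
lr = 0.01

for epoch in range(100):
    pred = model(X)                  # forward pass
    loss = F.mse_loss(pred, y)       # assumed MSE helper in the functional API
    loss.backward()                  # backpropagate through the graph
    for p in model.parameters():     # assumed: Sequential exposes its parameters
        p.data -= lr * p.grad        # assumed: raw array access via .data / .grad
        p.grad = None                # clear gradients before the next step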

Examples

MNIST Handwritten Digits

Training a neural network on the MNIST handwritten digit dataset with FizTorch, using the Adam optimizer (configurable learning rate), mini-batch support, and real-time accuracy/loss tracking. (Training-process figure.)

California Housing

Training a neural network on the California Housing dataset with FizTorch. (Training-process figure.)

Linear Layer

from fiztorch.tensor import Tensor
from fiztorch.nn import Linear

# Create a linear layer
layer = Linear(2, 3)

# Create some input data
input = Tensor([[1.0, 2.0]])

# Forward pass
output = layer(input)

# Print the output
print(output)

ReLU Activation

from fiztorch.tensor import Tensor
from fiztorch.nn import ReLU

# Create a ReLU activation
relu = ReLU()

# Create some input data
input = Tensor([-1.0, 0.0, 1.0])

# Forward pass
output = relu(input)

# Print the output
print(output)
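
The functional API listed under Features can express the same activation without constructing a layer object. F.relu is assumed here for illustration and may differ from the actual function name in fiztorch.nn.functional.

from fiztorch.tensor import Tensor
import fiztorch.nn.functional as F

# Same input as above, passed through the assumed functional ReLU
input = Tensor([-1.0, 0.0, 1.0])
output = F.relu(input)

# Print the output
print(output)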

Sequential Model

from fiztorch.tensor import Tensor
from fiztorch.nn import Linear, ReLU, Sequential

# Define a sequential model
model = Sequential(
    Linear(2, 3),
    ReLU(),
    Linear(3, 1)
)

# Create some input data
input = Tensor([[1.0, 2.0]])

# Forward pass
output = model(input)

# Print the output
print(output)

Contributing

Contributions are welcome! Please follow these steps to contribute:

  1. Fork the repository.
  2. Create a new branch (git checkout -b feature-branch).
  3. Commit your changes (git commit -am 'Add new feature').
  4. Push to the branch (git push origin feature-branch).
  5. Create a new Pull Request.

License

FizTorch is licensed under the MIT License. See the LICENSE file for more information.

Contact

For any questions or feedback, please open an issue or contact the maintainers.


Made with ❤️ by ahammadnafiz