# QuackNet

**QuackNet** is a Python library for building and training neural networks and convolutional neural networks entirely from scratch. It offers foundational implementations of key components such as forward propagation, backpropagation and optimisation algorithms, without relying on machine learning frameworks like TensorFlow or PyTorch.

## Key Features

**1. Custom Implementation:**
- Fully handwritten layers, activation functions and loss functions.
- No reliance on external machine learning libraries (except for NumPy).

**2. Core Neural Network Functionality:**
- Support for common activation functions (e.g. Leaky ReLU, Sigmoid, Softmax).
- Multiple loss functions with derivatives (e.g. MSE, MAE, Cross Entropy).

**3. Training:**
- Includes backpropagation for gradient calculation and parameter updates.
- Optimisers: Gradient Descent, Stochastic Gradient Descent (SGD) and the Adam optimiser.
- Supports batching for efficient training.

**4. Layer Support:**
- Fully Connected Layer (Dense)
- Convolutional
- Pooling (Max and Average)
- Global Average Pooling
- Activation Layers

**5. Additional Features:**
- Save and load model weights and biases.
- Evaluation metrics including accuracy and loss.
- Visualisation tools for training progress.
- Demo projects like MNIST and HAM10000 classification.

## Installation

QuackNet is simple to install via PyPI.

**Install via PyPI**

```
pip install QuackNet
```

## Usage Example

```python
from quacknet.main import Network

# Define a neural network architecture
n = Network(
    lossFunc = "cross entropy",
    learningRate = 0.01,
    optimisationFunc = "sgd",  # stochastic gradient descent
)
n.addLayer(3, "relu")     # Input layer
n.addLayer(2, "relu")     # Hidden layer
n.addLayer(1, "softmax")  # Output layer
n.createWeightsAndBiases()

# Example data
inputData = [[0.1, 0.2, 0.3], [0.4, 0.5, 0.6]]
labels = [[1], [0]]

# Train the network
accuracy, averageLoss = n.train(inputData, labels, epochs = 10)

# Evaluate
print(f"Accuracy: {accuracy}%")
print(f"Average loss: {averageLoss}")
```

## Examples

- [Simple Neural Network Example](/ExampleCode/NNExample.py): A basic neural network implementation demonstrating forward propagation and backpropagation
- [Convolutional Neural Network Example](/ExampleCode/CNNExample.py): Shows how to use the convolutional layers in the library
- [MNIST Neural Network Example](/ExampleCode/MNISTExample/mnistExample.py): Shows how to train a neural network on MNIST

## Highlights

- **Custom Architectures:** Define and train neural networks with fully customisable architectures
- **Optimisation Algorithms:** Includes Gradient Descent, Stochastic Gradient Descent and the Adam optimiser for efficient training
- **Loss and Activation Functions:** Prebuilt support for common loss and activation functions, with the option to make your own (a sketch follows this list)
- **Layer Support:**
  - Fully Connected (Dense)
  - Convolutional
  - Pooling (Max and Average)
  - Global Average Pooling
  - Activation Layers
- **Evaluation Tools:** Includes metrics for model evaluation such as accuracy and loss
- **Save and Load:** Save weights and biases for reuse or further training
- **Demo Projects:** Includes example implementations such as MNIST digit classification
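
For the "make your own" option above, an activation function is usually paired with its derivative so backpropagation can use it. Below is a minimal sketch of a custom ELU pair in plain NumPy; how such a pair would be registered with QuackNet is not shown here and depends on the library's API.

```python
import numpy as np

def elu(x, alpha=1.0):
    """ELU activation: identity for positive inputs, smooth exponential curve below zero."""
    return np.where(x > 0, x, alpha * (np.exp(x) - 1))

def elu_derivative(x, alpha=1.0):
    """Derivative used during backpropagation: 1 for positive inputs, alpha * exp(x) otherwise."""
    return np.where(x > 0, 1.0, alpha * np.exp(x))
```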

## Code structure

### Neural Network Class
- **Purpose:** Handles fully connected layers for a standard neural network
- **Key Components:**
  - Layers: Dense Layer
  - Functions: Forward propagation, backpropagation (a conceptual sketch follows this list)
  - Optimisers: SGD, GD, GD using batching
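
As a conceptual illustration of what the forward propagation and optimiser components do (a plain NumPy sketch of the underlying maths, not QuackNet's actual source code):

```python
import numpy as np

def dense_forward(inputs, weights, biases):
    """Forward propagation through one fully connected layer (pre-activation)."""
    return inputs @ weights + biases

def sgd_update(param, grad, learning_rate=0.01):
    """Stochastic gradient descent: step each parameter against its gradient."""
    return param - learning_rate * grad

# Toy shapes: 3 inputs -> 2 neurons
rng = np.random.default_rng(0)
x = rng.random(3)
W = rng.random((3, 2))
b = np.zeros(2)
z = dense_forward(x, W, b)              # layer output before the activation function
W = sgd_update(W, rng.random((3, 2)))   # one SGD step with a placeholder gradient
```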

### Convolutional Neural Network Class
- **Purpose:** Specialised for image data processing using convolutional layers
- **Key Components:**
  - Layers: Convolutional, pooling, dense and activation layers
  - Functions: Forward propagation, backpropagation, flattening, global average pooling (a pooling sketch follows this list)
  - Optimisers: Adam optimiser, SGD, GD, GD using batching
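
The pooling operations listed above reduce spatial resolution. A small conceptual NumPy sketch follows; QuackNet's own layer classes and method names may differ.

```python
import numpy as np

def max_pool_2x2(feature_map):
    """2x2 max pooling with stride 2 on a single-channel feature map."""
    h, w = feature_map.shape
    pooled = feature_map[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2)
    return pooled.max(axis=(1, 3))

def global_average_pool(feature_maps):
    """Global average pooling: (channels, H, W) -> one value per channel."""
    return feature_maps.mean(axis=(1, 2))

fm = np.arange(16, dtype=float).reshape(4, 4)
print(max_pool_2x2(fm))                          # 2x2 result of pooling the 4x4 map
print(global_average_pool(np.stack([fm, fm])))   # average of each channel
```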

## Related Projects

### Skin Lesion Detector

A convolutional neural network (CNN) skin lesion classification model built with QuackNet, trained using the HAM10000 dataset. This model achieved 60.2% accuracy on a balanced validation set.

You can explore the full project here:
[Skin Lesion Detector Repository](https://github.com/SirQuackPng/skinLesionDetector)

## License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
1""" 2# QuackNet 3 4**QuackNet** is a Python based building and training neural networks and convolutional networks entirely from scratch. It offers foundational implementations of key components such as forward propagation, backpropagation and optimisation algorithms, without relying on machine learning frameworks like TensorFlow or PyTorch 5 6## Key Features 7 8**1. Custom Implementation:** 9- Fully handwritten layers, activation functions and loss functions. 10- No reliance on external libraries for machine learning (except for numpy) 11 12**2. Core neural network functionality:** 13- Support for common activation functions (eg.Leaky ReLU, Sigmoid, Softmax) 14- Multiple loss functions with derivatives (eg. MSE, MAE, Cross entropy) 15 16**3. Training:** 17- includes backpropagation for gradient calculation and parameter updates 18- Optimisers: Gradient Descent, Stochastic Gradient Descent (SGD), and Adam optimiser. 19- Supports batching for efficient training. 20 21**4. Layer Support:** 22- Fully Connected Layer (Dense) 23- Convolutional 24- Pooling (Max and Average) 25- Global Average Pooling 26- Activation Layers 27 28**5. Additional Features:** 29- Save and load model weights and biases. 30- Evaluation metrics including accuracy and loss. 31- Visualisation tools for training progress. 32- Demo projects like MNIST and HAM10000 classification. 33 34## Installation 35 36QuackNet is simple to install via PyPI. 37 38**Install via PyPI** 39 40``` 41pip install QuackNet 42``` 43 44## Usage Example 45 46```Python 47from quacknet.main import Network 48 49# Define a neural network architecture 50n = Network( 51 lossFunc = "cross entropy", 52 learningRate = 0.01, 53 optimisationFunc = "sgd", #stochastic gradient descent 54) 55n.addLayer(3, "relu") # Input layer 56n.addLayer(2, "relu") # Hidden layer 57n.addLayer(1, "softmax") # Output layer 58n.createWeightsAndBiases() 59 60# Example data 61inputData = [[0.1, 0.2, 0.3], [0.4, 0.5, 0.6]] 62labels = [[1], [0]] 63 64# Train the network 65accuracy, averageLoss = n.train(inputData, labels, epochs = 10) 66 67# Evaluate 68print(f"Accuracy: {accuracy}%") 69print(f"Average loss: {averageLoss}") 70``` 71 72## Examples 73 74- [Simple Neural Network Example](/ExampleCode/NNExample.py): A basic neural network implementation demonstrating forward and backpropagation 75- [Convolutional Neural Network Example](/ExampleCode/CNNExample.py): Shows how to use the convolutional layers in the library 76- [MNIST Neural Network Example](/ExampleCode/MNISTExample/mnistExample.py): Shows how to use neural network to train on MNIST 77 78## Highlights 79 80- **Custom Architectures:** Define and train neural networks with fully customisable architectures 81- **Optimisation Algorithms:** Includes Gradient Descent, Stochastic Gradient Descent and Adam optimiser for efficient training 82- **Loss and Activation Functions:** Prebuilt support for common loss and activation functions with the option to make your own 83- **Layer Support:** 84 - Fully Connected (Dense) 85 - Convolutional 86 - Pooling (max and Average) 87 - Global Average Pooling 88 - Activation layer 89- **Evaluation Tools:** Includes metrics for model evaluation such as accuracy and loss 90- **Save and Load:** Save weights and biases for reuse for further training 91- **Demo Projects:** Includes example implementations such as MNIST digit classification 92 93## Code structure 94 95### Neural Network Class 96- **Purpose** Handles fully connected layers for standard neural network 97- **Key Components:** 98 - Layers: 
Dense Layer 99 - Functions: Forward propagation, backpropagation 100 - Optimisers: SGD, GD, GD using batching 101 102### Convolutional Neural Network Class 103- **Purpose** Specialised for image data processing using convolutional layers 104- **Key Components:** 105 - Layers: Convolutional, pooling, dense and activation layers 106 - Functions: Forward propagation, backpropagation, flattening, global average pooling 107 - Optimisers: Adam optimiser, SGD, GD, GD using batching 108 109## Related Projects 110 111### Skin Lesion Detector 112 113A convolutional neural network (CNN) skin lesion classification model built with QuackNet, trained using the HAM10000 dataset. This model achieved 60.2% accuracy on a balanced validation set. 114 115You can explore the full project here: 116[Skin Lesion Detector Repository](https://github.com/SirQuackPng/skinLesionDetector) 117 118## License 119 120This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details. 121"""