How to Build a Basic Neural Network in Python
Basic Neural Network:
Implement a simple neural network to gain a beginner's understanding of machine learning.
From AWS - A neural network is a method in artificial intelligence that teaches computers to process data in a way that is inspired by the human brain. It is a type of machine learning process, called deep learning, that uses interconnected nodes or neurons in a layered structure that resembles the human brain.
In this Basic Neural Network project, the user supplies input data for training and testing, and the program outputs the values the neural network predicts for the test data. The project provides a beginner's understanding of machine learning and neural networks.
Input values:
The user provides input data (features) for training and testing.
Output values:
Predicted output values generated by the neural network from the test input data.
Example:
Input values (Training Data):
- Features: [0.1, 0.2, 0.3]
- Target output: 0
Output value (Training): The neural network trains on the provided input features and target output.
Input values (Testing Data):
- Features: [0.4, 0.5, 0.6]
Output value (Testing): The neural network predicts the output based on the input features.
Predicted output: 1
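For reference, the training and testing data in this example map directly onto NumPy arrays. The variable names below are only illustrative; the two solutions that follow use the same shapes:

import numpy as np

# Training data: one sample of three features with its target output
X_train = np.array([[0.1, 0.2, 0.3]])
y_train = np.array([0])

# Testing data: the network should predict an output for this sample
X_test = np.array([[0.4, 0.5, 0.6]])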
Solution 1: Neural Network Using NumPy
This solution builds a simple neural network with one hidden layer using only NumPy for matrix operations.
Code:
import numpy as np

# Sigmoid activation function and its derivative
def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def sigmoid_derivative(x):
    return x * (1 - x)

# Neural network class
class SimpleNeuralNetwork:
    def __init__(self):
        # Initialize weights with random values
        np.random.seed(1)
        self.weights1 = np.random.rand(3, 4)  # Input layer to hidden layer
        self.weights2 = np.random.rand(4, 1)  # Hidden layer to output layer

    # Feedforward function to predict output
    def feedforward(self, inputs):
        self.layer1 = sigmoid(np.dot(inputs, self.weights1))
        self.output = sigmoid(np.dot(self.layer1, self.weights2))
        return self.output

    # Backpropagation function to train the network
    def backpropagation(self, inputs, target, learning_rate):
        output_error = target - self.output
        d_output = output_error * sigmoid_derivative(self.output)
        layer1_error = d_output.dot(self.weights2.T)
        d_layer1 = layer1_error * sigmoid_derivative(self.layer1)
        # Update weights
        self.weights2 += self.layer1.T.dot(d_output) * learning_rate
        self.weights1 += inputs.T.dot(d_layer1) * learning_rate

    # Training function
    def train(self, inputs, targets, iterations, learning_rate):
        for i in range(iterations):
            self.feedforward(inputs)
            self.backpropagation(inputs, targets, learning_rate)

# Example usage
nn = SimpleNeuralNetwork()
X = np.array([[0.1, 0.2, 0.3], [0.4, 0.5, 0.6]])  # Training features
y = np.array([[0], [1]])  # Target output
nn.train(X, y, iterations=10000, learning_rate=0.1)

# Testing
test_input = np.array([0.4, 0.5, 0.6])
prediction = nn.feedforward(test_input)
print("Predicted output:", prediction)
Output:
Predicted output: [0.90686614]
Explanation:
- Defined a sigmoid activation function and its derivative for backpropagation.
- Created a SimpleNeuralNetwork class to handle initialization, feedforward, and backpropagation processes.
- The train function trains the network on the input data over multiple iterations.
- Example usage trains the network on sample data and predicts output for a test input.
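If you want to watch the network learn, a common extension is to report the training error every few thousand iterations. The sketch below assumes the SimpleNeuralNetwork class and NumPy import defined above, and simply wraps the same feedforward and backpropagation calls in a loop that prints the mean squared error:

# Minimal training loop with error reporting (uses the class defined above)
nn = SimpleNeuralNetwork()
X = np.array([[0.1, 0.2, 0.3], [0.4, 0.5, 0.6]])
y = np.array([[0], [1]])

for i in range(10000):
    output = nn.feedforward(X)
    nn.backpropagation(X, y, learning_rate=0.1)
    if i % 2000 == 0:
        mse = np.mean((y - output) ** 2)  # mean squared error over the training set
        print(f"Iteration {i}: MSE = {mse:.4f}")

print("Final prediction:", nn.feedforward(np.array([0.4, 0.5, 0.6])))

The error should decrease steadily; if it plateaus, try adjusting the learning rate or the number of iterations.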
Solution 2: Neural Network Using scikit-learn
In this solution, we use the MLPClassifier from the scikit-learn library, which abstracts away much of the complexity of building a neural network by hand.
Code:
from sklearn.neural_network import MLPClassifier
import numpy as np
# Input data (training features and target output)
X = np.array([[0.1, 0.2, 0.3], [0.4, 0.5, 0.6]])
y = np.array([0, 1]) # Target output for training
# Create and train the neural network
nn = MLPClassifier(hidden_layer_sizes=(4,), activation='logistic', max_iter=10000)
nn.fit(X, y)
# Test the model
test_input = np.array([[0.4, 0.5, 0.6]])
prediction = nn.predict(test_input)
print("Predicted output:", prediction)
Output:
Predicted output: [1]
Explanation:
- Imported MLPClassifier from scikit-learn to implement a neural network.
- Created input data for training and defined the target output.
- Initialized and trained the MLPClassifier with one hidden layer of 4 neurons.
- Used the trained model to predict output based on new input data.
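If you need class probabilities rather than a hard 0/1 label, MLPClassifier also provides a predict_proba method. A brief sketch, continuing from the nn and test_input variables defined above:

# Probability of each class; columns follow the order in nn.classes_ (here [0, 1])
probabilities = nn.predict_proba(test_input)
print("Class probabilities:", probabilities)
print("Classes:", nn.classes_)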