Building VanillaNN: A Simplified Neural Network Framework from Scratch in Python

In this article, we’ll build a foundational neural network framework, modeled after the Keras API, entirely from scratch in Python. Before proceeding, it’s essential to have a working grasp of neural network concepts such as forward and backward propagation, loss functions, optimizers, and regularization techniques. With the introduction out of the way, let’s dive straight in!

For context, a standard neural network model in the Keras framework appears as follows. Our goal with this custom implementation is to mirror the Keras framework, albeit in a simplified form.

from keras.models import Sequential
from keras.layers import Dense

# define the keras model
model = Sequential()
model.add(Dense(12, input_dim=8, activation='relu'))
model.add(Dense(8, activation='relu'))
model.add(Dense(1, activation='sigmoid'))

# compile and train the model
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
model.fit(X, y, epochs=150, batch_size=10)

Creating the Core Neural Network Class

For a structured and well-organized neural network, it’s vital to encapsulate its key functionalities. To this end, we’re introducing the Vanilla class. This class will not only house the main functions of our neural network but also maintain variables that define its structure and parameter settings.
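To make this concrete, here is a minimal sketch of what such a class skeleton might look like. The method names mirror the Keras calls shown earlier (`add`, `compile`), but the attribute names and layer representation are illustrative assumptions, not the article's final implementation:

```python
import numpy as np

class Vanilla:
    """A minimal Keras-like sequential network container (illustrative sketch)."""

    def __init__(self):
        self.layers = []    # ordered list of layer configurations
        self.weights = []   # one weight matrix per dense layer
        self.biases = []    # one bias vector per dense layer
        self.loss = None
        self.optimizer = None

    def add(self, units, input_dim=None, activation='relu'):
        """Register a dense layer; infer input size from the previous layer."""
        if input_dim is None:
            if not self.layers:
                raise ValueError("input_dim is required for the first layer")
            input_dim = self.layers[-1]['units']
        self.layers.append({'units': units, 'activation': activation})
        # small random initialization for weights, zeros for biases
        self.weights.append(np.random.randn(input_dim, units) * 0.01)
        self.biases.append(np.zeros((1, units)))

    def compile(self, loss, optimizer):
        """Store the training configuration for later use in fit()."""
        self.loss = loss
        self.optimizer = optimizer
```

With this skeleton, the usage mirrors the Keras example above: `model = Vanilla()`, then successive `model.add(...)` calls, then `model.compile(...)`.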
