neural_network.back_propagation_neural_network

A framework for a back propagation (BP) neural network model

Easy to use:
  • add as many layers as you want

  • clearly see how the loss decreases

Easy to expand:
  • more activation functions

  • more loss functions

  • more optimization methods

Author: Stephen Lee | GitHub: https://github.com/RiptideBo | Date: 2017.11.23
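As a quick orientation before the API index below, here is a minimal usage sketch. It relies only on the call signatures documented on this page (DenseLayer(units, ...), BPNN.add_layer, build, summary, train, plot_loss); the data shapes, layer sizes, and hyperparameter values are illustrative assumptions, not values taken from the module.

    import numpy as np

    from neural_network.back_propagation_neural_network import BPNN, DenseLayer

    # Illustrative data: 10 samples with 10 features and 2-dimensional targets.
    x_data = np.random.randn(10, 10)
    y_data = np.random.rand(10, 2)

    model = BPNN()
    for units in (10, 20, 30, 2):   # input layer first, output layer last
        model.add_layer(DenseLayer(units))
    model.build()                   # wire the layers together
    model.summary()                 # print the layer shapes
    model.train(xdata=x_data, ydata=y_data, train_round=100, accuracy=0.01)
    model.plot_loss()               # visualise how the loss decreases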

Classes

BPNN

Back Propagation Neural Network model

DenseLayer

A single dense layer of the BP neural network

Functions

example()

sigmoid(x: numpy.ndarray) → numpy.ndarray

Module Contents

class neural_network.back_propagation_neural_network.BPNN

Back Propagation Neural Network model

add_layer(layer)
build()
cal_loss(ydata, ydata_)
plot_loss()
summary()
train(xdata, ydata, train_round, accuracy)
ax_loss
fig_loss
layers = []
train_mse = []
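The listing above does not say what cal_loss computes or how train_mse is filled. The self-contained sketch below shows one common reading: cal_loss as a mean squared error (suggested by the attribute name train_mse), with train appending one loss value per round and stopping early once the loss falls below the accuracy threshold. This is an assumption-based illustration, not the module's code.

    import numpy as np

    def cal_loss(ydata: np.ndarray, ydata_: np.ndarray) -> float:
        # Mean squared error between targets and predictions (assumed).
        return float(np.mean(np.square(ydata - ydata_)))

    train_mse = []                         # mirrors the train_mse attribute
    target = np.zeros((4, 2))
    for _ in range(100):                   # train_round
        prediction = np.random.rand(4, 2)  # stand-in for a forward pass
        loss = cal_loss(target, prediction)
        train_mse.append(loss)
        if loss < 0.01:                    # the accuracy threshold stops training early
            break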
class neural_network.back_propagation_neural_network.DenseLayer(units, activation=None, learning_rate=None, is_input_layer=False)

A single dense layer of the BP neural network

back_propagation(gradient)
cal_gradient()
forward_propagation(xdata)
initializer(back_units)
activation
bias = None
is_input_layer
learn_rate
units
weight = None
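To make the forward_propagation / back_propagation pair concrete, here is a minimal, self-contained dense-layer sketch using a sigmoid activation and plain gradient-descent updates. The class name TinyDenseLayer, the column-vector shapes, and the default learning rate are assumptions for illustration; this is the textbook mathematics, not the module's own implementation.

    import numpy as np

    # Illustrative dense layer (not the module's implementation):
    # forward:  output = sigmoid(weight @ x + bias)
    # backward: gradient-descent update of weight and bias, returning the
    #           gradient with respect to the input for the previous layer.
    class TinyDenseLayer:
        def __init__(self, in_units: int, units: int, learning_rate: float = 0.3):
            self.weight = np.random.randn(units, in_units) * 0.1
            self.bias = np.zeros((units, 1))
            self.learn_rate = learning_rate

        def forward_propagation(self, xdata: np.ndarray) -> np.ndarray:
            # xdata is a column vector of shape (in_units, 1).
            self.xdata = xdata
            wx_plus_b = self.weight @ xdata + self.bias
            self.output = 1.0 / (1.0 + np.exp(-wx_plus_b))        # sigmoid
            return self.output

        def back_propagation(self, gradient: np.ndarray) -> np.ndarray:
            # gradient is dLoss/dOutput for this layer, shape (units, 1).
            delta = gradient * self.output * (1.0 - self.output)  # sigmoid derivative
            grad_input = self.weight.T @ delta        # flows to the previous layer
            self.weight -= self.learn_rate * (delta @ self.xdata.T)
            self.bias -= self.learn_rate * delta
            return grad_input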
neural_network.back_propagation_neural_network.example()
neural_network.back_propagation_neural_network.sigmoid(x: numpy.ndarray) → numpy.ndarray
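The sigmoid above is presumably the standard logistic function applied element-wise to a numpy array; in that case it reduces to the one-liner below.

    import numpy as np

    def sigmoid(x: np.ndarray) -> np.ndarray:
        # Logistic function 1 / (1 + exp(-x)), applied element-wise.
        return 1.0 / (1.0 + np.exp(-x))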