Neural network based on NumPy: how to add and adjust dropout?

Together with Andrew Trask, DeepMind data scientist and Udacity Deep Learning instructor, we take a deeper look at dropout using a hand-written NumPy neural network.

Summary: Dropout is used in almost all current state-of-the-art neural networks. This tutorial shows how to add dropout to a neural network with a few lines of Python code. After completing this tutorial, you will have a working dropout implementation and will have mastered the skill of adding and tuning dropout in any neural network.

If this article interests you, you are welcome to follow @iamtrask on Twitter and to send me feedback.

Straight to the code

import numpy as np
X = np.array([ [0,0,1],[0,1,1],[1,0,1],[1,1,1] ])
y = np.array([[0,1,1,0]]).T
alpha, hidden_dim, dropout_percent, do_dropout = (0.5, 4, 0.2, True)  # learning rate, hidden size, dropout rate, dropout on/off
synapse_0 = 2*np.random.random((3,hidden_dim)) - 1
synapse_1 = 2*np.random.random((hidden_dim,1)) - 1
for j in range(60000):
    layer_1 = (1/(1+np.exp(-(np.dot(X,synapse_0)))))  # sigmoid hidden layer
    if (do_dropout):
        layer_1 *= np.random.binomial([np.ones((len(X),hidden_dim))],1-dropout_percent)[0] * (1.0/(1-dropout_percent))
    layer_2 = 1/(1+np.exp(-(np.dot(layer_1,synapse_1))))  # sigmoid output layer
    layer_2_delta = (layer_2 - y)*(layer_2*(1-layer_2))
    layer_1_delta = layer_2_delta.dot(synapse_1.T) * (layer_1 * (1-layer_1))
    synapse_1 -= (alpha * layer_1.T.dot(layer_2_delta))
    synapse_0 -= (alpha * X.T.dot(layer_1_delta))
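Once the loop finishes, a plain forward pass with dropout left off gives the network's predictions. A minimal sketch, reusing X, synapse_0, and synapse_1 from the code above (the print statements are my own illustration):

layer_1 = 1/(1+np.exp(-np.dot(X, synapse_0)))        # no dropout mask at prediction time
layer_2 = 1/(1+np.exp(-np.dot(layer_1, synapse_1)))
print(layer_2)                                       # should be close to y = [[0],[1],[1],[0]]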

First, what is dropout?

As mentioned in the previous article, a neural network is a glorified search problem. Each node in the network searches for correlation between the input data and the correct output data.

Consider the picture from the previous article. The curve represents the error the network produces for each particular weight. The low points of the curve (read: low error) mark the weights "finding" the relationship between input and output. The balls in the figure represent different weight settings; they are all trying to find those low points.

Notice the colors. The balls' starting positions are randomly generated (just like a neural network's weights). If two balls randomly start in the same colored region, they will converge to the same point. That is redundancy! It wastes compute and memory! And it is exactly what happens in neural networks.

Why dropout: dropout helps prevent weights from converging to identical positions. It does this by randomly turning nodes off during forward propagation, and then turning all nodes back on during backpropagation. Let us take a closer look.
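To make "randomly turning nodes off" concrete, here is a small standalone sketch (not part of the network above; the example activation values are mine) that zeroes out a random subset of a layer's activations with NumPy:

import numpy as np
layer = np.array([[0.2, 0.9, 0.5, 0.7]])    # pretend these are 4 hidden-node activations
dropout_percent = 0.5
mask = np.random.binomial(1, 1 - dropout_percent, size=layer.shape)
print(mask)            # e.g. [[1 0 1 0]] -- the zeros are the nodes turned off this pass
print(layer * mask)    # the corresponding activations become 0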

Second, how to add and adjust dropout?

To apply dropout to a network layer, we randomly set some of that layer's values to 0 during forward propagation, as in line 10 of the code (the line that multiplies layer_1 by a random binomial mask).

Line 9: parameterizes whether to use dropout at all. We only want dropout during the training phase; do not use it at run time or on the test dataset. Furthermore, this line means we need to scale up the forward-propagated values, in proportion to the number of values that were turned off. A simple intuition: if you turn off half of the hidden layer, you need to double the forward-propagated values to correctly compensate the output. Thanks to @karpathy for pointing this out.
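A quick numerical check of that intuition (my own illustration, not from the original post): drop half of a layer's values, double the survivors, and the layer's total output is preserved on average.

import numpy as np
np.random.seed(0)
layer = np.ones((1, 1000))                                   # 1000 activations, all 1.0
dropout_percent = 0.5
mask = np.random.binomial(1, 1 - dropout_percent, size=layer.shape)
dropped = layer * mask                                       # roughly half the values are now 0
scaled = dropped * (1.0 / (1 - dropout_percent))             # double the survivors to compensate
print(layer.sum(), dropped.sum(), scaled.sum())              # 1000.0, ~500, ~1000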

Best practices for tuning

Line 4: parameterizes the dropout percentage. This affects the probability that any one node is turned off. A good initial value for hidden layers is 50%. If you apply dropout to the input layer, it is best not to exceed 25%.
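If you also want dropout on the input layer, the same masking trick applies with a smaller rate. A minimal sketch following the 25%/50% rule of thumb above (the standalone setup and variable names are mine, not part of the original code):

import numpy as np
X = np.array([[0,0,1],[0,1,1],[1,0,1],[1,1,1]])               # same inputs as above
synapse_0 = 2*np.random.random((3, 4)) - 1                    # 4 hidden nodes, as above
input_dropout = 0.25                                          # keep input-layer dropout at or below 25%
hidden_dropout = 0.5                                          # 50% is a good start for hidden layers
X_dropped = X * np.random.binomial(1, 1 - input_dropout, size=X.shape) * (1.0/(1 - input_dropout))
layer_1 = 1/(1 + np.exp(-np.dot(X_dropped, synapse_0)))
layer_1 *= np.random.binomial(1, 1 - hidden_dropout, size=layer_1.shape) * (1.0/(1 - hidden_dropout))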

Hinton advocates tuning dropout together with the size of your hidden layer. First turn dropout off and increase the hidden layer size until you fit your data perfectly. Then, using the same hidden layer size, train with dropout turned on. This should be a near-optimal configuration. As soon as training ends, turn dropout off. Voila! You have a working neural network!
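To make that recipe concrete, here is a rough sketch that wraps the training loop from the top of this article in a helper function. The train_error helper and the specific hidden-layer sizes are my own illustration, not from the original post:

import numpy as np

def train_error(hidden_dim, do_dropout, dropout_percent=0.5, alpha=0.5, iters=60000):
    # Same training loop as above, parameterized by hidden-layer size and dropout.
    X = np.array([[0,0,1],[0,1,1],[1,0,1],[1,1,1]])
    y = np.array([[0,1,1,0]]).T
    synapse_0 = 2*np.random.random((3, hidden_dim)) - 1
    synapse_1 = 2*np.random.random((hidden_dim, 1)) - 1
    for j in range(iters):
        layer_1 = 1/(1+np.exp(-np.dot(X, synapse_0)))
        if do_dropout:
            layer_1 *= np.random.binomial(1, 1-dropout_percent, size=layer_1.shape) * (1.0/(1-dropout_percent))
        layer_2 = 1/(1+np.exp(-np.dot(layer_1, synapse_1)))
        layer_2_delta = (layer_2 - y)*(layer_2*(1-layer_2))
        layer_1_delta = layer_2_delta.dot(synapse_1.T) * (layer_1*(1-layer_1))
        synapse_1 -= alpha * layer_1.T.dot(layer_2_delta)
        synapse_0 -= alpha * X.T.dot(layer_1_delta)
    return np.mean(np.abs(layer_2 - y))

# Step 1: no dropout, grow the hidden layer until the data is fit almost perfectly.
for hidden_dim in (2, 4, 8):
    print(hidden_dim, train_error(hidden_dim, do_dropout=False))

# Step 2: keep the chosen hidden_dim and train again with dropout turned on.
print(train_error(hidden_dim=8, do_dropout=True))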
