Lec 09-1  Neural Nets (NN) for XOR

1. Forward Propagation 
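The slide's figure is not reproduced here, but the idea is easy to show in code. Below is a minimal NumPy sketch of the forward pass, assuming one set of hand-picked weights (taken to be the ones on the lecture slides) that solves XOR with two hidden units and one output unit:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR inputs, one example per row
x = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=np.float32)

# Hand-picked weights (assumed from the slides): 2 inputs -> 2 hidden units -> 1 output
W1 = np.array([[5, -7], [5, -7]], dtype=np.float32)
b1 = np.array([-8, 3], dtype=np.float32)
W2 = np.array([[-11], [-11]], dtype=np.float32)
b2 = np.array([6], dtype=np.float32)

k = sigmoid(x.dot(W1) + b1)       # hidden-layer activations
y_hat = sigmoid(k.dot(W2) + b2)   # network output, close to [0, 1, 1, 0]
print(y_hat.round(3))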



  


2. NN in vector (matrix) form


   


# Hidden layer: X is [N, 2], W1 is [2, 2], b1 is [2]
K = tf.sigmoid(tf.matmul(X, W1) + b1)

# Output layer: K is [N, 2], W2 is [2, 1], b2 is [1]
hypothesis = tf.sigmoid(tf.matmul(K, W2) + b2)


Lec 09-x  Derivatives Review

1. Basic derivative 
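A quick worked example at the level the lecture uses:

    f(x) = 2x,  df/dx = lim(Δx→0) (2(x+Δx) - 2x) / Δx = 2

    f(x) = 3 (a constant),  df/dx = 0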


    


2. Partial derivative: treat the other variables as constants
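For example, with f(x, y) = xy:

    ∂f/∂x = y    (y is held constant)

    ∂f/∂y = x    (x is held constant)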



Lec 09-2  Training a Deep Network (backpropagation)

1. Back propagation 


   1) Proposed in 1974 and again in 1982 by Paul Werbos; rediscovered and popularized in 1986 by Hinton

  

    

    https://devblogs.nvidia.com/parallelforall/inference-next-step-gpu-accelerated-deep-learning/

 

2. Back Propagation (chain rule)

  1) forward pass (w = -2, x = 5, b = 3)

  2) backward pass: compute derivatives with the chain rule (see the sketch after this list)

  3) derivative of the sigmoid (also covered in the sketch)
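A minimal sketch of the f = wx + b example with the lecture's numbers, splitting it into g = w*x and f = g + b as the computation graph does; the sigmoid identity at the end, σ'(z) = σ(z)(1 - σ(z)), is the standard result:

import numpy as np

# forward pass: g = w*x, then f = g + b
w, x, b = -2.0, 5.0, 3.0
g = w * x    # g = -10
f = g + b    # f = -7

# backward pass (chain rule): multiply local derivatives back-to-front
df_dg = 1.0              # f = g + b  ->  df/dg = 1
df_db = 1.0              # f = g + b  ->  df/db = 1
dg_dw = x                # g = w*x    ->  dg/dw = x = 5
dg_dx = w                # g = w*x    ->  dg/dx = w = -2
df_dw = df_dg * dg_dw    # df/dw = 5
df_dx = df_dg * dg_dx    # df/dx = -2

# derivative of the sigmoid: sigma'(z) = sigma(z) * (1 - sigma(z))
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

dsigma = sigmoid(f) * (1.0 - sigmoid(f))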

     

       



Lab 09-1 : Neural Net for XOR

1. XOR data set 

import numpy as np
import tensorflow as tf

x_data = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=np.float32)

y_data = np.array([[0], [1], [1], [0]], dtype=np.float32)
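The snippets below also assume the lab's usual placeholders, which this excerpt omits:

X = tf.placeholder(tf.float32, [None, 2])
Y = tf.placeholder(tf.float32, [None, 1])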


2. A single model cannot solve it: one logistic unit draws a single linear boundary, and no line separates {(0,0), (1,1)} from {(0,1), (1,0)}. XOR is not linearly separable, so a single-layer model stalls around 0.5 accuracy.


3. Neural Nets for XOR 

# Single logistic-regression layer (for contrast: this one cannot fit XOR)
W = tf.Variable(tf.random_normal([2, 1]), name='weight')
b = tf.Variable(tf.random_normal([1]), name='bias')

# Hypothesis using sigmoid: tf.div(1., 1. + tf.exp(tf.matmul(X, W)))
hypothesis = tf.sigmoid(tf.matmul(X, W) + b)



# Two-layer neural net: the hidden layer is what lets the model bend the decision boundary
W1 = tf.Variable(tf.random_normal([2, 2]), name='weight1')
b1 = tf.Variable(tf.random_normal([2]), name='bias1')
layer1 = tf.sigmoid(tf.matmul(X, W1) + b1)

W2 = tf.Variable(tf.random_normal([2, 1]), name='weight2')
b2 = tf.Variable(tf.random_normal([1]), name='bias2')
hypothesis = tf.sigmoid(tf.matmul(layer1, W2) + b2)
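The excerpt omits the cost and training step. A minimal sketch in the style of the lab, with cross-entropy cost and plain gradient descent (the learning rate and step count here are assumptions):

cost = -tf.reduce_mean(Y * tf.log(hypothesis) + (1 - Y) * tf.log(1 - hypothesis))
train = tf.train.GradientDescentOptimizer(learning_rate=0.1).minimize(cost)

predicted = tf.cast(hypothesis > 0.5, dtype=tf.float32)
accuracy = tf.reduce_mean(tf.cast(tf.equal(predicted, Y), dtype=tf.float32))

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(10001):
        sess.run(train, feed_dict={X: x_data, Y: y_data})
    h, c, a = sess.run([hypothesis, predicted, accuracy],
                       feed_dict={X: x_data, Y: y_data})
    print("\nHypothesis: ", h, "\nCorrect: ", c, "\nAccuracy: ", a)

A run of this kind of loop produces output like the printout below.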


Hypothesis:  [[ 0.01338216]
 [ 0.98166394]
 [ 0.98809403]
 [ 0.01135799]]
Correct:  [[ 0.]
 [ 1.]
 [ 1.]
 [ 0.]]
Accuracy:  1.0


4. Wide & Deep 

    1) Wide : 변수의 갯수를 늘리는 것 

    2) Deep : 모델 수를 더 많이 사용하는 것 (
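A minimal sketch of both ideas, assuming 10 hidden units per layer and one extra hidden layer (the sizes are illustrative):

# Wide: 10 hidden units instead of 2
W1 = tf.Variable(tf.random_normal([2, 10]), name='weight1')
b1 = tf.Variable(tf.random_normal([10]), name='bias1')
layer1 = tf.sigmoid(tf.matmul(X, W1) + b1)

# Deep: add another hidden layer before the output
W2 = tf.Variable(tf.random_normal([10, 10]), name='weight2')
b2 = tf.Variable(tf.random_normal([10]), name='bias2')
layer2 = tf.sigmoid(tf.matmul(layer1, W2) + b2)

W3 = tf.Variable(tf.random_normal([10, 1]), name='weight3')
b3 = tf.Variable(tf.random_normal([1]), name='bias3')
hypothesis = tf.sigmoid(tf.matmul(layer2, W3) + b3)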



Lab 09-2 : Tensorboard (Neural Net for XOR)

1. Five steps for using TensorBoard


Step 1: From the TF graph, decide which tensors you want to log

w2_hist = tf.summary.histogram("weights2", W2)

cost_summ = tf.summary.scalar("cost", cost)

Step 2: Merge all summaries

summary = tf.summary.merge_all()

Step 3: Create a writer and add the graph

# Create summary writer

writer = tf.summary.FileWriter('./logs')

writer.add_graph(sess.graph)

Step 4: Run the summary merge and add_summary

s, _ = sess.run([summary, optimizer], feed_dict=feed_dict)

writer.add_summary(s, global_step=global_step)

Step 5: Launch TensorBoard

tensorboard --logdir=./logs
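After launching, open http://localhost:6006 in a browser (6006 is TensorBoard's default port). To compare several runs, a common pattern is to write each run to its own subdirectory and point --logdir at the parent (the directory name here is illustrative):

writer = tf.summary.FileWriter('./logs/run1')

tensorboard --logdir=./logs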


2. Result in TensorBoard




