Lab 00 - Installing TensorFlow and Basic Operations

 


1. TensorFlow

  • Open-source software library for numerical computation using data flow graphs.
  • Python API

2. Data Flow Graph

  • Nodes: mathematical operations
  • Edges: multidimensional data arrays (tensors) communicated between them.

3. Check installation and version (TensorFlow 1.4)
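A quick way to check both from Python (a minimal sketch; these notes were written against TensorFlow 1.4, so the 1.x API is assumed in the code sketches below):

    import tensorflow as tf

    # Prints the installed version, e.g. "1.4.0"
    print(tf.__version__)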

 

 

4. Example

 

  1) Hello world
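A minimal "Hello, TensorFlow!" sketch using the 1.x session API:

    import tensorflow as tf

    # Create a constant op; it is added as a node to the default graph
    hello = tf.constant("Hello, TensorFlow!")

    # Nothing is computed until the graph is run inside a session
    sess = tf.Session()
    print(sess.run(hello))   # b'Hello, TensorFlow!' (a bytes literal in Python 3)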

 

  2) Node Add 
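Adding two constant nodes (sketch). Printing the nodes themselves only shows tensor metadata; the values appear only when the session runs the graph:

    import tensorflow as tf

    node1 = tf.constant(3.0, tf.float32)
    node2 = tf.constant(4.0)          # float32 is inferred
    node3 = tf.add(node1, node2)      # same as node1 + node2

    print(node1, node2, node3)        # Tensor objects, no values yet

    sess = tf.Session()
    print(sess.run([node1, node2]))   # [3.0, 4.0]
    print(sess.run(node3))            # 7.0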

 

  3) Placeholder
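With placeholders the graph is built first and the actual values are supplied at run time through feed_dict (sketch, same 1.x assumption):

    import tensorflow as tf

    a = tf.placeholder(tf.float32)
    b = tf.placeholder(tf.float32)
    adder_node = a + b                # shortcut for tf.add(a, b)

    sess = tf.Session()
    print(sess.run(adder_node, feed_dict={a: 3, b: 4.5}))          # 7.5
    print(sess.run(adder_node, feed_dict={a: [1, 3], b: [2, 4]}))  # [3. 7.]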

 

 

 

 

 

 

TensorFlow Mechanics 

 

1. Build a graph using TensorFlow operations

2. Feed data and run the graph (operation): sess.run(op)

3. Update variables in the graph (and return values)
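The three steps above in one small sketch (a hypothetical counter example, TF 1.x assumed):

    import tensorflow as tf

    # 1. Build graph using TensorFlow operations
    counter = tf.Variable(0, name="counter")
    increment = tf.assign_add(counter, 1)   # op that updates the variable

    # 2. Feed data and run the graph: sess.run(op)
    sess = tf.Session()
    sess.run(tf.global_variables_initializer())

    # 3. Update variables in the graph (and return values)
    for _ in range(3):
        print(sess.run(increment))          # 1, 2, 3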

 

 

 

4. Tensor Ranks, Shapes, and Types

  1) Rank

  2) Shape

  3) Type
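All three can be inspected directly on a tensor; a small sketch:

    import tensorflow as tf

    t = tf.constant([[1., 2., 3.],
                     [4., 5., 6.]])

    print(t.shape)                # (2, 3)               -> shape
    print(t.dtype)                # <dtype: 'float32'>   -> type

    sess = tf.Session()
    print(sess.run(tf.rank(t)))   # 2                    -> rank (number of dimensions)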

 

 

 

Lab 02 - Linear Regression implemented with TensorFlow

 

1. Example
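A minimal linear-regression sketch in the spirit of the lab (TF 1.x; the training data and learning rate are illustrative):

    import tensorflow as tf

    # Training data: y = x, so the ideal parameters are W = 1, b = 0
    x_train = [1, 2, 3]
    y_train = [1, 2, 3]

    W = tf.Variable(tf.random_normal([1]), name="weight")
    b = tf.Variable(tf.random_normal([1]), name="bias")

    hypothesis = x_train * W + b
    cost = tf.reduce_mean(tf.square(hypothesis - y_train))
    train = tf.train.GradientDescentOptimizer(learning_rate=0.01).minimize(cost)

    sess = tf.Session()
    sess.run(tf.global_variables_initializer())

    for step in range(2001):
        sess.run(train)
        if step % 200 == 0:
            print(step, sess.run(cost), sess.run(W), sess.run(b))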

 

2. Linear Regression with Placeholder
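The same model with placeholders, so the training data is fed at run time and the trained model can then be queried with new X values (sketch):

    import tensorflow as tf

    X = tf.placeholder(tf.float32, shape=[None])
    Y = tf.placeholder(tf.float32, shape=[None])

    W = tf.Variable(tf.random_normal([1]), name="weight")
    b = tf.Variable(tf.random_normal([1]), name="bias")

    hypothesis = X * W + b
    cost = tf.reduce_mean(tf.square(hypothesis - Y))
    train = tf.train.GradientDescentOptimizer(learning_rate=0.01).minimize(cost)

    sess = tf.Session()
    sess.run(tf.global_variables_initializer())

    for step in range(2001):
        cost_val, _ = sess.run([cost, train],
                               feed_dict={X: [1, 2, 3], Y: [1, 2, 3]})

    # Prediction with the trained parameters
    print(sess.run(hypothesis, feed_dict={X: [5, 2.5]}))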

 

 

 

 

Lab 03 - Linear Regression: Minimizing Cost

 

 

1. Plot the cost function
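Sweeping W over a range and evaluating the cost at each value shows the convex curve (sketch; matplotlib is assumed to be available):

    import tensorflow as tf
    import matplotlib.pyplot as plt

    X = [1, 2, 3]
    Y = [1, 2, 3]

    W = tf.placeholder(tf.float32)
    hypothesis = X * W                 # simplified model, bias omitted
    cost = tf.reduce_mean(tf.square(hypothesis - Y))

    sess = tf.Session()

    W_vals, cost_vals = [], []
    for i in range(-30, 50):
        w = i * 0.1
        W_vals.append(w)
        cost_vals.append(sess.run(cost, feed_dict={W: w}))

    plt.plot(W_vals, cost_vals)        # convex curve with its minimum at W = 1
    plt.xlabel("W")
    plt.ylabel("cost(W)")
    plt.show()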

 

 

2. Optimize by hand (manual gradient update)
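Applying the gradient descent update manually with assign instead of using an optimizer (sketch; the learning rate is illustrative):

    import tensorflow as tf

    x_data = [1, 2, 3]
    y_data = [1, 2, 3]

    X = tf.placeholder(tf.float32)
    Y = tf.placeholder(tf.float32)
    W = tf.Variable(tf.random_normal([1]), name="weight")

    hypothesis = X * W
    cost = tf.reduce_mean(tf.square(hypothesis - Y))

    # Manual update rule: W := W - lr * mean((W*X - Y) * X)
    # (the cost gradient up to a constant factor)
    learning_rate = 0.1
    gradient = tf.reduce_mean((W * X - Y) * X)
    update = W.assign(W - learning_rate * gradient)

    sess = tf.Session()
    sess.run(tf.global_variables_initializer())

    for step in range(21):
        sess.run(update, feed_dict={X: x_data, Y: y_data})
        print(step, sess.run(cost, feed_dict={X: x_data, Y: y_data}), sess.run(W))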

 

 

3. Optimize with the GradientDescentOptimizer function
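The same minimization with the built-in optimizer, which computes the gradient and applies the update in a single op (sketch):

    import tensorflow as tf

    X = [1, 2, 3]
    Y = [1, 2, 3]

    W = tf.Variable(5.0)               # deliberately bad starting point
    hypothesis = X * W
    cost = tf.reduce_mean(tf.square(hypothesis - Y))

    # One line replaces the hand-written gradient and assign op
    train = tf.train.GradientDescentOptimizer(learning_rate=0.1).minimize(cost)

    sess = tf.Session()
    sess.run(tf.global_variables_initializer())

    for step in range(100):
        print(step, sess.run(W))
        sess.run(train)                # W converges to 1.0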

 

4. Calculate the gradient values
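To look at the gradient values themselves, the optimizer's compute_gradients / apply_gradients pair can be used instead of minimize (sketch):

    import tensorflow as tf

    X = [1, 2, 3]
    Y = [1, 2, 3]

    W = tf.Variable(5.0)
    hypothesis = X * W
    cost = tf.reduce_mean(tf.square(hypothesis - Y))

    optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.01)

    # gvs is a list of (gradient, variable) pairs; here just [(dcost/dW, W)]
    gvs = optimizer.compute_gradients(cost)
    apply_gradients = optimizer.apply_gradients(gvs)

    sess = tf.Session()
    sess.run(tf.global_variables_initializer())

    for step in range(10):
        print(step, sess.run(gvs))     # current gradient value and W
        sess.run(apply_gradients)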

 

 

 

 

Lab 04 - Multi-variable Linear Regression

 

1. Multi-Variable  
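With each input feature handled as its own scalar placeholder, the hypothesis simply gets one weight per feature (sketch; the score data and learning rate are illustrative):

    import tensorflow as tf

    # Three features per example (e.g. three exam scores), target is a final score
    x1_data = [73., 93., 89., 96., 73.]
    x2_data = [80., 88., 91., 98., 66.]
    x3_data = [75., 93., 90., 100., 70.]
    y_data  = [152., 185., 180., 196., 142.]

    x1 = tf.placeholder(tf.float32)
    x2 = tf.placeholder(tf.float32)
    x3 = tf.placeholder(tf.float32)
    Y  = tf.placeholder(tf.float32)

    w1 = tf.Variable(tf.random_normal([1]))
    w2 = tf.Variable(tf.random_normal([1]))
    w3 = tf.Variable(tf.random_normal([1]))
    b  = tf.Variable(tf.random_normal([1]))

    hypothesis = x1 * w1 + x2 * w2 + x3 * w3 + b
    cost = tf.reduce_mean(tf.square(hypothesis - Y))
    train = tf.train.GradientDescentOptimizer(learning_rate=1e-5).minimize(cost)

    sess = tf.Session()
    sess.run(tf.global_variables_initializer())

    for step in range(2001):
        cost_val, _ = sess.run([cost, train],
                               feed_dict={x1: x1_data, x2: x2_data,
                                          x3: x3_data, Y: y_data})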

 

 

2. Multi-Variable with matrix
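The same model in matrix form: stack the features into an X matrix of shape [m, 3] and use a single [3, 1] weight matrix, so the hypothesis becomes one matmul (sketch):

    import tensorflow as tf

    x_data = [[73., 80., 75.],
              [93., 88., 93.],
              [89., 91., 90.],
              [96., 98., 100.],
              [73., 66., 70.]]
    y_data = [[152.], [185.], [180.], [196.], [142.]]

    X = tf.placeholder(tf.float32, shape=[None, 3])   # m examples, 3 features
    Y = tf.placeholder(tf.float32, shape=[None, 1])

    W = tf.Variable(tf.random_normal([3, 1]), name="weight")
    b = tf.Variable(tf.random_normal([1]), name="bias")

    hypothesis = tf.matmul(X, W) + b                  # H(X) = X W + b
    cost = tf.reduce_mean(tf.square(hypothesis - Y))
    train = tf.train.GradientDescentOptimizer(learning_rate=1e-5).minimize(cost)

    sess = tf.Session()
    sess.run(tf.global_variables_initializer())

    for step in range(2001):
        cost_val, hy_val, _ = sess.run([cost, hypothesis, train],
                                       feed_dict={X: x_data, Y: y_data})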

 

 

 

3. Slice Matrix
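NumPy slicing splits a loaded array into features and labels in one line each; a few illustrative examples:

    import numpy as np

    xy = np.array([[73., 80., 75., 152.],
                   [93., 88., 93., 185.],
                   [89., 91., 90., 180.]], dtype=np.float32)

    x_data = xy[:, 0:-1]    # all rows, every column except the last -> features
    y_data = xy[:, [-1]]    # all rows, only the last column, kept 2-D -> labels

    print(x_data.shape)     # (3, 3)
    print(y_data.shape)     # (3, 1)
    print(xy[0:2, :])       # first two rows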

 

 

4. Loading data from a file
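Loading the training data from a CSV file with np.loadtxt; the file name 'data-01-test-score.csv' is the one used in the course repository and stands in for any comma-separated score file:

    import numpy as np

    # Assumes a CSV file where each row is: x1, x2, x3, y
    xy = np.loadtxt('data-01-test-score.csv', delimiter=',', dtype=np.float32)

    x_data = xy[:, 0:-1]
    y_data = xy[:, [-1]]

    print(x_data.shape, len(x_data))
    print(y_data.shape)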

 

 

5. Queue Runners
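For files too large to load at once, TF 1.x queue runners stream batches from disk; a sketch of the filename-queue / reader / decode_csv / batch pipeline (the file name and record defaults are assumptions matching the CSV above):

    import tensorflow as tf

    # 1. Queue of input file names
    filename_queue = tf.train.string_input_producer(
        ['data-01-test-score.csv'], shuffle=False, name='filename_queue')

    # 2. Reader that returns one line (record) at a time
    reader = tf.TextLineReader()
    key, value = reader.read(filename_queue)

    # 3. Decode the CSV line; record_defaults also fixes each column's dtype
    record_defaults = [[0.], [0.], [0.], [0.]]
    xy = tf.decode_csv(value, record_defaults=record_defaults)

    # 4. Collect the decoded records into batches
    train_x_batch, train_y_batch = tf.train.batch(
        [xy[0:-1], xy[-1:]], batch_size=10)

    sess = tf.Session()

    # Start the threads that fill the queues
    coord = tf.train.Coordinator()
    threads = tf.train.start_queue_runners(sess=sess, coord=coord)

    for step in range(5):
        x_batch, y_batch = sess.run([train_x_batch, train_y_batch])
        print(step, x_batch.shape, y_batch.shape)

    coord.request_stop()
    coord.join(threads)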

 

 

 

[An error occurred here] Moving on for now

 

 

 

 

 

Posted by 꿈을펼쳐라

Course website: http://hunkim.github.io/ml/

Facebook: https://www.facebook.com/groups/Tenso...

Source code: https://github.com/hunkim/DeepLearnin...

 

 

Lec 00 - Course overview and schedule for Machine/Deep Learning



 

Lec 01 - Machine learning concepts and terminology

 

1. Machine Learning

  - "Field of study that gives computers the ability to learn without being explicitly programmed" (Arthur Samuel, 1959)

 

2. Supervised/Unsupervised learning

  - Supervised learning:

     . learning with labeled examples

  - Unsupervised learning: un-labeled data

     . Google news grouping

     . Word clustering

 

3. Types of supervised learning

  • Predicting final exam score based on time spent

    - regression

  • Pass/non-pass based on time spent

    - binary classification

  • Letter grade (A, B, C, D, and F) based on time spent

    - multinomial (multi-class) classification



Lec 02 - Hypothesis and cost for Linear Regression

1. Linear hypothesis

    H(x) = W x + b

 

2. Cost

    error for a single example: H(x) - y

    cost(W, b) = (1/m) * Σ_{i=1..m} ( H(x_i) - y_i )^2,  where H(x) = W x + b

 

3. Goal: minimize cost

      minimize cost(W, b) over W and b

 

 

 

Lec 03 - Cost minimization algorithm for Linear Regression

 

1. Hypothesis and Cost (simplified: bias b omitted)

  • H(x) = W x

  • cost(W) = (1/m) * Σ_{i=1..m} ( W x_i - y_i )^2

 

 2. Gradient descent algorithm: minimize the cost function

  • Formal definition (with cost scaled by 1/2 for convenience, which does not change the minimizer):

        W := W - α * (1/m) * Σ_{i=1..m} ( W x_i - y_i ) x_i

  • How about the Newton-Raphson algorithm?

 

 3. Derivative Calculator : [link]

 

 4. Check the cost function

  • It must be a convex curve so that gradient descent converges to the global minimum.

 

Lec 04 - Multi-variable Linear Regression

 

1. Hypothesis for multiple variables

    H(x1, x2, x3) = w1 x1 + w2 x2 + w3 x3 + b

 

2. Hypothesis using a matrix

    H(X) = X W    (X: m x n data matrix, W: n x 1 weight column; the bias b can be added or folded into W)

 

 

Posted by 꿈을펼쳐라