TensorFlow Keras MNIST Image Classification: Optimization & Training (Beginner)

A beginner image-classification example from the official TensorFlow site.

Beginner example:

https://www.tensorflow.org/tutorials/quickstart/beginner

Training workflow

Import TensorFlow, Keras layers, and MNIST

import tensorflow as tf
from tensorflow.keras import layers
import numpy as np
import matplotlib.pyplot as plt
from tensorflow.keras import datasets

(train_x, train_y), (test_x, test_y) = datasets.mnist.load_data()
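
Since matplotlib is imported above, a quick sanity check of the loaded arrays might look like this (a minimal sketch; the shapes shown are the standard MNIST shapes):

print(train_x.shape, train_y.shape)  # (60000, 28, 28) (60000,)
print(test_x.shape, test_y.shape)    # (10000, 28, 28) (10000,)

plt.imshow(train_x[0], 'gray')       # visualize the first training image
plt.title(str(train_y[0]))           # its label
plt.show()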

Build Model

inputs = layers.Input((28, 28, 1))
net = layers.Conv2D(32, (3, 3), padding='SAME')(inputs)
net = layers.Activation('relu')(net)
net = layers.Conv2D(32, (3, 3), padding='SAME')(net)
net = layers.Activation('relu')(net)
net = layers.MaxPooling2D(pool_size=(2, 2))(net)
net = layers.Dropout(0.25)(net)

net = layers.Conv2D(64, (3, 3), padding='SAME')(net)
net = layers.Activation('relu')(net)
net = layers.Conv2D(64, (3, 3), padding='SAME')(net)
net = layers.Activation('relu')(net)
net = layers.MaxPooling2D(pool_size=(2, 2))(net)
net = layers.Dropout(0.25)(net)

net = layers.Flatten()(net)
net = layers.Dense(512)(net)
net = layers.Activation('relu')(net)
net = layers.Dropout(0.5)(net)
net = layers.Dense(10)(net)  # num_classes
net = layers.Activation('softmax')(net)

model = tf.keras.Model(inputs=inputs, outputs=net, name='Basic_CNN')
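
To confirm the layer shapes and parameter counts, the built model can be inspected with model.summary() (output shapes follow from the 28x28x1 input above):

model.summary()  # prints each layer's output shape and parameter count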

Model Compile

Settings to configure before training the model:

  • Loss Function
  • Optimization
  • Metrics

Loss Function

The quantity the model tries to minimize during training; it measures how far the model's predictions are from the target labels.

loss = 'binary_crossentropy'
loss = 'categorical_crossentropy'

tf.keras.losses.sparse_categorical_crossentropy
tf.keras.losses.categorical_crossentropy
tf.keras.losses.binary_crossentropy

Binary Crossentropy: a good choice when there are two label classes (assumed to be 0 and 1).

sparse_categorical_crossentropy takes integer class labels, while categorical_crossentropy expects one-hot encoded labels.
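
A minimal sketch of that difference, using the MNIST labels loaded above (the one-hot conversion via tf.keras.utils.to_categorical is standard Keras; y_pred is just a placeholder probability tensor for illustration):

y_int = train_y[:5]                                  # integer labels, e.g. [5 0 4 1 9]
y_onehot = tf.keras.utils.to_categorical(y_int, 10)  # one-hot labels, shape (5, 10)

y_pred = tf.ones((5, 10)) / 10                       # uniform fake predictions

# sparse version works directly with integer labels
print(tf.keras.losses.sparse_categorical_crossentropy(y_int, y_pred))
# categorical version needs the one-hot labels
print(tf.keras.losses.categorical_crossentropy(y_onehot, y_pred))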

Metrics

Evaluation metrics. Related to the validation set; used to monitor the training process.

tf.keras.metrics.Accuracy()

metrics = ['accuracy']
metrics = [tf.keras.metrics.Accuracy()]  # metric objects are passed in a list
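
A metric object can also be used on its own to accumulate results; a minimal sketch of the stateful API (update_state / result are the standard tf.keras.metrics methods):

acc = tf.keras.metrics.Accuracy()
acc.update_state([1, 2, 3, 4], [1, 2, 0, 4])  # compare labels with predictions
print(acc.result().numpy())                   # 0.75 (3 of 4 match)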

Optimization

The optimizer that updates the model's weights to minimize the loss. Common choices:

  • tf.keras.optimizers.Adam
  • tf.keras.optimizers.RMSprop
  • tf.keras.optimizers.SGD

optm = tf.keras.optimizers.Adam()

Compile

model.compile(optimizer=optm, 
              loss=tf.keras.losses.SparseCategoricalCrossentropy(), 
              metrics=[tf.keras.metrics.SparseCategoricalAccuracy()])
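
The same configuration can also be written with Keras string shortcuts (a minimal equivalent sketch):

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])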

Prepare Dataset

Prepare the dataset to be used for training.

train_x.shape  # check the number of dimensions
test_x.shape   # check the number of dimensions
train_x = train_x[..., tf.newaxis]  # add a channel dimension
test_x = test_x[..., tf.newaxis]    # add a channel dimension

# Rescaling: scale pixel values from [0, 255] to [0, 1]
train_x = train_x / 255
test_x = test_x / 255
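
After these steps the arrays should have a trailing channel dimension (the shapes below are the standard MNIST shapes, shown for reference):

print(train_x.shape)                 # (60000, 28, 28, 1)
print(test_x.shape)                  # (10000, 28, 28, 1)
print(train_x.min(), train_x.max())  # 0.0 1.0 after rescaling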

Training

Start training.

epochs: the number of passes over the entire training set
batch_size: the number of samples processed at a time, for efficient use of compute resources

num_epochs = 1
batch_size = 16

train_x = tf.cast(train_x, dtype=tf.float32)
train_y = tf.cast(train_y, dtype=tf.float32)

hist = model.fit(train_x, train_y, 
                 batch_size=batch_size, 
                 shuffle=True, 
                 epochs=num_epochs) 

If a ValueError occurs while running model.fit, see the reference material.

Check the training results

hist.history
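
hist.history is a dict of per-epoch values; a minimal sketch for plotting the recorded loss (the exact key names match those reported by model.fit):

print(hist.history.keys())      # e.g. dict_keys(['loss', 'sparse_categorical_accuracy'])
plt.plot(hist.history['loss'])  # training loss per epoch
plt.xlabel('epoch')
plt.ylabel('loss')
plt.show()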

Check Predictions

predictModel = model.predict(train_x)
predictModel
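
The output of model.predict is a (num_samples, 10) array of class probabilities; a minimal sketch for turning it into class labels and comparing them with the ground truth (np.argmax is standard NumPy):

pred_labels = np.argmax(predictModel, axis=-1)  # most probable class per sample
print(pred_labels[:10])  # predicted digits for the first 10 training images
print(train_y[:10])      # ground-truth labels for comparison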
