A Simple Convolutional Neural Network in TensorFlow

Published: 2023-01-22

import tensorflow as tf
import matplotlib.pyplot as plt
from tensorflow.keras import datasets, layers, models

# Download and prepare the dataset
fashion_mnist = tf.keras.datasets.fashion_mnist
(x_train, y_train), (x_test, y_test) = fashion_mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# Add a channels dimension
x_train = x_train[..., tf.newaxis].astype("float32")
x_test = x_test[..., tf.newaxis].astype("float32")
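
The `tf.newaxis` indexing above appends a trailing channels axis, turning each 28×28 grayscale image into a 28×28×1 tensor, which is the input shape `Conv2D` expects. A minimal NumPy sketch of the same operation (NumPy's `np.newaxis` behaves identically here; the small fake batch stands in for the 60,000-image training set):

```python
import numpy as np

# A small fake batch of grayscale 28x28 images
x = np.zeros((4, 28, 28), dtype="float32")

# Append a trailing channels dimension, as done with tf.newaxis above
x_expanded = x[..., np.newaxis]

print(x_expanded.shape)  # (4, 28, 28, 1)
```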


# Use tf.data to split the dataset into batches and shuffle it:
train_ds = tf.data.Dataset.from_tensor_slices(
    (x_train, y_train)).shuffle(10000).batch(32)
test_ds = tf.data.Dataset.from_tensor_slices((x_test, y_test)).batch(32)
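
With a batch size of 32, the 60,000 training images yield exactly 60000 / 32 = 1875 steps per epoch, which matches the `1875/1875` progress counter in the training log below. A quick arithmetic check:

```python
import math

train_examples = 60000  # Fashion-MNIST training set size
batch_size = 32

# tf.data's .batch() yields ceil(n / batch_size) batches when the
# last, possibly smaller, batch is kept (the default behavior)
steps_per_epoch = math.ceil(train_examples / batch_size)
print(steps_per_epoch)  # 1875
```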

# Build the convolutional neural network model
model = models.Sequential()
model.add(layers.Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)))
model.add(layers.MaxPooling2D((2, 2)))
model.add(layers.Conv2D(64, (3, 3), activation='relu'))
model.add(layers.MaxPooling2D((2, 2)))
model.add(layers.Conv2D(64, (3, 3), activation='relu'))
model.add(layers.Flatten())
model.add(layers.Dense(512, activation='relu'))
model.add(layers.Dense(10))
model.summary()
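
The parameter counts that `model.summary()` prints can be verified by hand: a `Conv2D` layer has (kh·kw·in_channels + 1)·filters weights, and a `Dense` layer has (inputs + 1)·units. A sketch reproducing the totals for this architecture (spatial sizes assume `'valid'` padding, the Keras default):

```python
def conv2d_params(kh, kw, in_ch, filters):
    # kernel weights plus one bias per filter
    return (kh * kw * in_ch + 1) * filters

def dense_params(in_units, units):
    # weight matrix plus one bias per unit
    return (in_units + 1) * units

# Spatial size through the stack: 'valid' 3x3 convs shrink by 2,
# 2x2 max-pooling floors the division: 28->26->13->11->5->3
flat = 3 * 3 * 64  # 576 features feeding the first Dense layer

total = (conv2d_params(3, 3, 1, 32)     # 320
         + conv2d_params(3, 3, 32, 64)  # 18,496
         + conv2d_params(3, 3, 64, 64)  # 36,928
         + dense_params(flat, 512)      # 295,424
         + dense_params(512, 10))       # 5,130
print(total)  # 356298
```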


# Compile and train the model
model.compile(optimizer='adam',
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              metrics=['accuracy'])

history = model.fit(train_ds, epochs=10,
                    validation_data=test_ds)

# Evaluate the model
# Jupyter-only magic: renders matplotlib figures at retina resolution
%config InlineBackend.figure_format = 'retina'
plt.plot(history.history['accuracy'], 'bo', label='accuracy')
plt.plot(history.history['val_accuracy'], 'b', label='val_accuracy')
plt.xlabel('Epoch')
plt.ylabel('Accuracy')
plt.ylim([0.5, 1])
plt.legend(loc='lower right')
plt.show()
Training output:

Epoch 1/10
1875/1875 [==============================] - 28s 15ms/step - loss: 0.4641 - accuracy: 0.8272 - val_loss: 0.3534 - val_accuracy: 0.8739
Epoch 2/10
1875/1875 [==============================] - 29s 15ms/step - loss: 0.3014 - accuracy: 0.8892 - val_loss: 0.2999 - val_accuracy: 0.8895
Epoch 3/10
1875/1875 [==============================] - 31s 17ms/step - loss: 0.2567 - accuracy: 0.9042 - val_loss: 0.2805 - val_accuracy: 0.8973
Epoch 4/10
1875/1875 [==============================] - 36s 19ms/step - loss: 0.2272 - accuracy: 0.9142 - val_loss: 0.2588 - val_accuracy: 0.9043
Epoch 5/10
1875/1875 [==============================] - 36s 19ms/step - loss: 0.2009 - accuracy: 0.9230 - val_loss: 0.2750 - val_accuracy: 0.9048
Epoch 6/10
1875/1875 [==============================] - 36s 19ms/step - loss: 0.1784 - accuracy: 0.9324 - val_loss: 0.2757 - val_accuracy: 0.9083
Epoch 7/10
1875/1875 [==============================] - 35s 18ms/step - loss: 0.1576 - accuracy: 0.9408 - val_loss: 0.2755 - val_accuracy: 0.9069
Epoch 8/10
1875/1875 [==============================] - 37s 20ms/step - loss: 0.1409 - accuracy: 0.9462 - val_loss: 0.2916 - val_accuracy: 0.9060
Epoch 9/10
1875/1875 [==============================] - 34s 18ms/step - loss: 0.1277 - accuracy: 0.9509 - val_loss: 0.3154 - val_accuracy: 0.9096
Epoch 10/10
1875/1875 [==============================] - 32s 17ms/step - loss: 0.1111 - accuracy: 0.9575 - val_loss: 0.3339 - val_accuracy: 0.9104
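
The log above shows classic overfitting: training accuracy keeps climbing while validation loss bottoms out around epoch 4 and validation accuracy plateaus near 0.91. The growing train/validation gap can be read straight off the logged numbers:

```python
# Accuracy per epoch, copied from the training log above
train_acc = [0.8272, 0.8892, 0.9042, 0.9142, 0.9230,
             0.9324, 0.9408, 0.9462, 0.9509, 0.9575]
val_acc = [0.8739, 0.8895, 0.8973, 0.9043, 0.9048,
           0.9083, 0.9069, 0.9060, 0.9096, 0.9104]

# Gap between training and validation accuracy per epoch
gaps = [round(t - v, 4) for t, v in zip(train_acc, val_acc)]
print(gaps[0], gaps[-1])  # -0.0467 0.0471
```

The gap flips sign early on and then grows steadily, which is the usual signal to stop training earlier or add regularization.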

