
CNN image classification



Contents


from google.colab import files
from google.colab import drive

drive.mount('/content/drive')
# Drive already mounted at /content/drive; to attempt to forcibly remount,
# call drive.mount("/content/drive", force_remount=True).

## Convolutional Neural Network - Image Classification

from tensorflow.compat.v1 import ConfigProto
from tensorflow.compat.v1 import InteractiveSession

config = ConfigProto()
config.gpu_options.per_process_gpu_memory_fraction = 0.5
config.gpu_options.allow_growth = True
session = InteractiveSession(config=config)

# Convolutional Neural Network

# Importing the libraries
import tensorflow as tf
from tensorflow.keras.preprocessing.image import ImageDataGenerator

tf.__version__
# '2.5.0'

# Part 1 - Data Preprocessing

# Preprocessing the Training set
train_datagen = ImageDataGenerator(rescale = 1./255,
                                   shear_range = 0.2,
                                   zoom_range = 0.2,
                                   horizontal_flip = True)
training_set = train_datagen.flow_from_directory('/content/chest_xray/chest_xray/train',
                                                 target_size = (64, 64), ...)
# Found 5216 images belonging to 2 classes.

# Preprocessing the Test set
test_datagen = ImageDataGenerator(rescale = 1./255)
test_set = test_datagen.flow_from_directory('/content/chest_xray/chest_xray/test',
                                            target_size = (64, 64), ...)
# Found 624 images belonging to 2 classes.

from tensorflow.keras.layers import Conv2D
from tensorflow.keras.layers import Dense
from tensorflow.keras.regularizers import l2

# Part 2 - Building the CNN

# Initialising the CNN
cnn = tf.keras.models.Sequential()

# Step 1 - Convolution
cnn.add(tf.keras.layers.Conv2D(filters=32, padding="same", kernel_size=3,
                               activation='relu', strides=2, input_shape=[64, 64, 3]))

# Step 2 - Pooling
cnn.add(tf.keras.layers.MaxPool2D(pool_size=2, strides=2))

# Adding a second convolutional layer
cnn.add(tf.keras.layers.Conv2D(filters=32, padding='same', kernel_size=3, activation='relu'))
cnn.add(tf.keras.layers.MaxPool2D(pool_size=2, strides=2))

# Step 3 - Flattening
cnn.add(tf.keras.layers.Flatten())

# Step 4 - Full Connection
cnn.add(tf.keras.layers.Dense(units=128, activation='relu'))

# Step 5 - Output Layer
# cnn.add(tf.keras.layers.Dense(units=1, activation='sigmoid'))   ## For binary classification
cnn.add(Dense(1, kernel_regularizer=tf.keras.regularizers.l2(0.01), activation='linear'))
## For multi-class classification
cnn.add(Dense(4, kernel_regularizer=tf.keras.regularizers.l2(0.01), activation='softmax'))

cnn.compile(optimizer = 'adam', loss = 'squared_hinge', metrics = ['accuracy'])
cnn.summary()

Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
conv2d (Conv2D)              (None, 32, 32, 32)        896
max_pooling2d (MaxPooling2D) (None, 16, 16, 32)        0
conv2d_1 (Conv2D)            (None, 16, 16, 32)        9248
max_pooling2d_1 (MaxPooling2 (None, 8, 8, 32)          0
flatten (Flatten)            (None, 2048)              0
dense (Dense)                (None, 128)               262272
dense_1 (Dense)              (None, 1)                 129
dense_2 (Dense)              (None, 4)                 8
=================================================================
Total params: 272,553
Trainable params: 272,553
Non-trainable params: 0
_________________________________________________________________
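As a quick sanity check on the summary above, the parameter counts follow directly from the layer shapes: a Conv2D layer has (kernel_h * kernel_w * in_channels + 1) * filters parameters, and a Dense layer has (inputs + 1) * units. The short sketch below is not part of the original notebook; it only recomputes the numbers printed by cnn.summary().

# Sketch: recompute the parameter counts reported by cnn.summary().
def conv2d_params(kernel, in_ch, filters):
    # one (kernel x kernel x in_ch) kernel per filter, plus a bias per filter
    return (kernel * kernel * in_ch + 1) * filters

def dense_params(inputs, units):
    # weight matrix plus one bias per unit
    return (inputs + 1) * units

layer_params = [
    conv2d_params(3, 3, 32),        # conv2d:   (3*3*3  + 1) * 32  = 896
    conv2d_params(3, 32, 32),       # conv2d_1: (3*3*32 + 1) * 32  = 9248
    dense_params(8 * 8 * 32, 128),  # dense:    (2048   + 1) * 128 = 262272
    dense_params(128, 1),           # dense_1:  (128    + 1) * 1   = 129
    dense_params(1, 4),             # dense_2:  (1      + 1) * 4   = 8
]
print(layer_params, sum(layer_params))   # total = 272553, matching the summary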
# Part 3 - Training the CNN

# Compiling the CNN
cnn.compile(optimizer = 'adam', loss = 'hinge', metrics = ['accuracy'])

# Training the CNN on the Training set and evaluating it on the Test set
r = cnn.fit(x = training_set, validation_data = test_set, epochs = 30)
# Training log (abridged): 163 steps per epoch at roughly 82-87 s per epoch;
# the loss falls from about 0.88 in the first epochs to 0.8785 and then plateaus,
# and the training accuracy settles at 0.7429 from about epoch 10 through epoch 30
# (the validation columns are cut off in the source).

score = cnn.evaluate(test_set, verbose=0)
print("The Accuracy score is:", score)
# The Accuracy score is: [0.9375, 0.625]

# plot the loss
import matplotlib.pyplot as plt
plt.plot(r.history['loss'], label='train loss')
plt.plot(r.history['val_loss'], label='val loss')
plt.legend()
plt.show()
plt.savefig('LossVal_loss')

# plot the accuracy
plt.plot(r.history['accuracy'], label='train acc')
plt.plot(r.history['val_accuracy'], label='val acc')
plt.legend()
plt.show()
plt.savefig('AccVal_acc')

# save it as a h5 file
from tensorflow.keras.models import load_model

cnn.save('model_rcat.h5')
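Two details in the cells above are easy to trip over. First, cnn.evaluate returns the loss followed by each compiled metric, so in [0.9375, 0.625] the 0.9375 is the hinge loss and 0.625 is the test accuracy. Second, in notebook backends plt.savefig called after plt.show usually writes an empty image, because show renders and releases the current figure. A minimal sketch of the same steps with those two points handled; it assumes cnn, r, and test_set from the cells above, and keeps the notebook's file names.

# evaluate() returns [loss, metric_1, ...] in compile order
loss, acc = cnn.evaluate(test_set, verbose=0)
print(f"test loss = {loss:.4f}, test accuracy = {acc:.4f}")

import matplotlib.pyplot as plt

# plot the loss curves, saving *before* show() so the file is not blank
plt.plot(r.history['loss'], label='train loss')
plt.plot(r.history['val_loss'], label='val loss')
plt.legend()
plt.savefig('LossVal_loss')
plt.show()

# plot the accuracy curves
plt.plot(r.history['accuracy'], label='train acc')
plt.plot(r.history['val_accuracy'], label='val acc')
plt.legend()
plt.savefig('AccVal_acc')
plt.show()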
from tensorflow.keras.models import load_model

# load model
model = load_model('model_rcat.h5')
model.summary()
# Prints the same "sequential" architecture shown above (272,553 trainable parameters).

# Part 4 - Making a single prediction
import numpy as np
from tensorflow.keras.preprocessing import image

test_image = image.load_img('/content/chest_xray/val/NORMAL/NORMAL2-IM-1427-0001.jpeg', target_size = (64, 64))
test_image = image.img_to_array(test_image)
test_image = test_image / 255
test_image = np.expand_dims(test_image, axis = 0)
result1 = cnn.predict(test_image)

# Part 4 - Making a single prediction
test_image = image.load_img('/content/chest_xray/val/PNEUMONIA/person1950_bacteria_4881.jpeg', target_size = (64, 64))
test_image = image.img_to_array(test_image)
test_image = test_image / 255
test_image = np.expand_dims(test_image, axis = 0)
result2 = cnn.predict(test_image)

# Part 4 - Making a single prediction
test_image = image.load_img('/content/chest_xray/val/NORMAL/NORMAL2-IM-1437-0001.jpeg', target_size = (64, 64))
test_image = image.img_to_array(test_image)
test_image = test_image / 255
test_image = np.expand_dims(test_image, axis = 0)
result3 = cnn.predict(test_image)

predictions1 = np.argmax(result1)
predictions2 = np.argmax(result2)
predictions3 = np.argmax(result3)
print("The prediction That I have :", predictions1)
print("The prediction That I have :", predictions2)
print("The prediction That I have :", predictions3)

categories = {0: 'NORMAL', 1: 'PNEUMONIA'}
print("The prediction Of the Image is : ", categories[predictions1])
print("The prediction Of the Image is : ", categories[predictions2])
print("The prediction Of the Image is : ", categories[predictions3])
# The prediction Of the Image is :  PNEUMONIA
# The prediction Of the Image is :  PNEUMONIA
# The prediction Of the Image is :  PNEUMONIA

# show the image
import matplotlib.pyplot as plt
test_image = image.load_img('/content/chest_xray/val/PNEUMONIA/person1950_bacteria_4881.jpeg', target_size = (64, 64))
plt.axis('off')
plt.imshow(test_image)
plt.show()

# show the image
test_image = image.load_img('/content/chest_xray/val/NORMAL/NORMAL2-IM-1437-0001.jpeg', target_size = (64, 64))
plt.axis('off')
plt.imshow(test_image)
plt.show()
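The three prediction cells repeat the same preprocessing: load the image at 64x64, convert it to an array, rescale by 1/255 as during training, and add a batch dimension before calling predict. A minimal sketch that wraps those steps in a helper, assuming the cnn model and categories mapping from the cells above; predict_label is a hypothetical name, not part of the notebook.

import numpy as np
from tensorflow.keras.preprocessing import image

def predict_label(model, img_path, categories, target_size=(64, 64)):
    """Load one image, apply the notebook's preprocessing, and map the
    arg-max of the model output to a category name."""
    img = image.load_img(img_path, target_size=target_size)
    x = image.img_to_array(img) / 255.0   # same rescaling as the training generator
    x = np.expand_dims(x, axis=0)         # add the batch dimension
    probs = model.predict(x)
    return categories[int(np.argmax(probs))]

# usage (path taken from the notebook)
categories = {0: 'NORMAL', 1: 'PNEUMONIA'}
print(predict_label(cnn, '/content/chest_xray/val/NORMAL/NORMAL2-IM-1427-0001.jpeg', categories))

Note that because the notebook's final layer has four units, np.argmax can return an index outside the two-entry categories mapping; with the commented-out single sigmoid output one would instead threshold the prediction at 0.5.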

Posted: 09/09/2022, 10:05
