Image Segmentation with ResNet U-Net

Based on code in Divam Gupta's image-segmentation-keras repository (https://github.com/divamgupta/image-segmentation-keras).
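
The keras_segmentation package imported below is the one distributed from that repository. A minimal install sketch (assuming the PyPI package name keras-segmentation; installing straight from GitHub should also work):

!pip install keras-segmentation
# or, straight from the repository:
# !pip install git+https://github.com/divamgupta/image-segmentation-keras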


In [1]:
import os
os.environ["CUDA_DEVICE_ORDER"] = "PCI_BUS_ID"
# os.environ["CUDA_VISIBLE_DEVICES"] = ""  # uncomment to hide all GPUs and run on CPU
os.environ["CUDA_VISIBLE_DEVICES"] = "0"   # use only the first GPU

In [2]:
import keras_segmentation


Using TensorFlow backend.

In [3]:
model = keras_segmentation.models.unet.resnet50_unet(n_classes=51, 
                                                      input_height=416, 
                                                      input_width=608)
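
resnet50_unet builds a U-Net style decoder on top of a ResNet-50 encoder and returns an ordinary Keras Model, so it can be inspected directly. A small sketch (output_height/output_width/n_classes are attributes the library attaches to its models; treat them as an assumption if your installed version differs):

model.summary()

# The decoder does not upsample all the way back to the 416x608 input, so
# the predicted mask is on a coarser grid; these attributes record its size.
print(model.n_classes, model.output_height, model.output_width)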

In [4]:
model.train(
    train_images="image-seg-data/images_prepped_train/",
    train_annotations="image-seg-data/annotations_prepped_train/",
    checkpoints_path="model_output/image-seg-ResNet",  # one checkpoint is written per epoch
    epochs=20,
    validate=True,
    val_images="image-seg-data/images_prepped_test/",
    val_annotations="image-seg-data/annotations_prepped_test/"
)


Verifying train dataset
100%|██████████| 367/367 [00:01<00:00, 213.49it/s]
Dataset verified!
Verifying val dataset
100%|██████████| 101/101 [00:00<00:00, 204.33it/s]
Dataset verified!
Starting Epoch  0
Epoch 1/1
512/512 [==============================] - 183s - loss: 0.6805 - acc: 0.8234 - val_loss: 0.4753 - val_acc: 0.8559
saved  model_output/image-seg-ResNet.model.0
Finished Epoch 0
Starting Epoch  1
Epoch 1/1
512/512 [==============================] - 140s - loss: 0.2582 - acc: 0.9208 - val_loss: 0.2969 - val_acc: 0.9081
saved  model_output/image-seg-ResNet.model.1
Finished Epoch 1
Starting Epoch  2
Epoch 1/1
512/512 [==============================] - 140s - loss: 0.1808 - acc: 0.9416 - val_loss: 0.2757 - val_acc: 0.9178
saved  model_output/image-seg-ResNet.model.2
Finished Epoch 2
Starting Epoch  3
Epoch 1/1
512/512 [==============================] - 140s - loss: 0.1463 - acc: 0.9510 - val_loss: 0.2725 - val_acc: 0.9195
saved  model_output/image-seg-ResNet.model.3
Finished Epoch 3
Starting Epoch  4
Epoch 1/1
512/512 [==============================] - 140s - loss: 0.1222 - acc: 0.9578 - val_loss: 0.3011 - val_acc: 0.9150
saved  model_output/image-seg-ResNet.model.4
Finished Epoch 4
Starting Epoch  5
Epoch 1/1
512/512 [==============================] - 140s - loss: 0.1058 - acc: 0.9626 - val_loss: 0.3182 - val_acc: 0.9101
saved  model_output/image-seg-ResNet.model.5
Finished Epoch 5
Starting Epoch  6
Epoch 1/1
512/512 [==============================] - 140s - loss: 0.0955 - acc: 0.9656 - val_loss: 0.3255 - val_acc: 0.9176
saved  model_output/image-seg-ResNet.model.6
Finished Epoch 6
Starting Epoch  7
Epoch 1/1
512/512 [==============================] - 139s - loss: 0.0879 - acc: 0.9680 - val_loss: 0.3047 - val_acc: 0.9213
saved  model_output/image-seg-ResNet.model.7
Finished Epoch 7
Starting Epoch  8
Epoch 1/1
512/512 [==============================] - 140s - loss: 0.0807 - acc: 0.9702 - val_loss: 0.3020 - val_acc: 0.9231
saved  model_output/image-seg-ResNet.model.8
Finished Epoch 8
Starting Epoch  9
Epoch 1/1
512/512 [==============================] - 140s - loss: 0.0764 - acc: 0.9716 - val_loss: 0.3123 - val_acc: 0.9253
saved  model_output/image-seg-ResNet.model.9
Finished Epoch 9
Starting Epoch  10
Epoch 1/1
512/512 [==============================] - 140s - loss: 0.0716 - acc: 0.9732 - val_loss: 0.3855 - val_acc: 0.9100
saved  model_output/image-seg-ResNet.model.10
Finished Epoch 10
Starting Epoch  11
Epoch 1/1
512/512 [==============================] - 141s - loss: 0.0672 - acc: 0.9747 - val_loss: 0.3102 - val_acc: 0.9257
saved  model_output/image-seg-ResNet.model.11
Finished Epoch 11
Starting Epoch  12
Epoch 1/1
512/512 [==============================] - 140s - loss: 0.0641 - acc: 0.9757 - val_loss: 0.3484 - val_acc: 0.9228
saved  model_output/image-seg-ResNet.model.12
Finished Epoch 12
Starting Epoch  13
Epoch 1/1
512/512 [==============================] - 140s - loss: 0.0615 - acc: 0.9767 - val_loss: 0.3679 - val_acc: 0.9191
saved  model_output/image-seg-ResNet.model.13
Finished Epoch 13
Starting Epoch  14
Epoch 1/1
512/512 [==============================] - 140s - loss: 0.0580 - acc: 0.9779 - val_loss: 0.3641 - val_acc: 0.9208
saved  model_output/image-seg-ResNet.model.14
Finished Epoch 14
Starting Epoch  15
Epoch 1/1
512/512 [==============================] - 140s - loss: 0.0565 - acc: 0.9784 - val_loss: 0.3556 - val_acc: 0.9209
saved  model_output/image-seg-ResNet.model.15
Finished Epoch 15
Starting Epoch  16
Epoch 1/1
512/512 [==============================] - 139s - loss: 0.0549 - acc: 0.9790 - val_loss: 0.3627 - val_acc: 0.9232
saved  model_output/image-seg-ResNet.model.16
Finished Epoch 16
Starting Epoch  17
Epoch 1/1
512/512 [==============================] - 140s - loss: 0.0524 - acc: 0.9799 - val_loss: 0.3631 - val_acc: 0.9221
saved  model_output/image-seg-ResNet.model.17
Finished Epoch 17
Starting Epoch  18
Epoch 1/1
512/512 [==============================] - 139s - loss: 0.0505 - acc: 0.9806 - val_loss: 0.4280 - val_acc: 0.9138
saved  model_output/image-seg-ResNet.model.18
Finished Epoch 18
Starting Epoch  19
Epoch 1/1
512/512 [==============================] - 140s - loss: 0.0493 - acc: 0.9810 - val_loss: 0.3706 - val_acc: 0.9231
saved  model_output/image-seg-ResNet.model.19
Finished Epoch 19
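
Training loss drops steadily, but validation loss bottoms out at epoch 3 (0.2725) and then drifts upward, which suggests the model starts to overfit from there; that is why the epoch-3 checkpoint is reloaded next. A quick sketch that replots the curves, with the per-epoch values transcribed from the log above:

import matplotlib.pyplot as plt

train_loss = [0.6805, 0.2582, 0.1808, 0.1463, 0.1222, 0.1058, 0.0955,
              0.0879, 0.0807, 0.0764, 0.0716, 0.0672, 0.0641, 0.0615,
              0.0580, 0.0565, 0.0549, 0.0524, 0.0505, 0.0493]
val_loss = [0.4753, 0.2969, 0.2757, 0.2725, 0.3011, 0.3182, 0.3255,
            0.3047, 0.3020, 0.3123, 0.3855, 0.3102, 0.3484, 0.3679,
            0.3641, 0.3556, 0.3627, 0.3631, 0.4280, 0.3706]

plt.plot(train_loss, label="train loss")
plt.plot(val_loss, label="val loss")
plt.xlabel("epoch")
plt.ylabel("loss")
plt.legend()
plt.show()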

In [5]:
# Reload the epoch-3 checkpoint (lowest validation loss), using the path printed by the training loop above
model.load_weights('model_output/image-seg-ResNet.model.3')

In [6]:
out = model.predict_segmentation(
    inp="image-seg-data/images_prepped_test/0016E5_07965.png",
    out_fname="output-ResNet-4epochs.png"
)
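
predict_segmentation should return the predicted class map as a 2-D array of label indices and also write a colourised segmentation image to out_fname. A quick sketch for viewing both (matplotlib assumed; file name as written above):

import matplotlib.pyplot as plt

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(12, 4))
ax1.imshow(out)                                      # raw class-index map
ax1.set_title("predicted class indices")
ax2.imshow(plt.imread("output-ResNet-4epochs.png"))  # colourised image written by the library
ax2.set_title("output-ResNet-4epochs.png")
plt.show()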
