
[Deep Learning] Implementing a U-Net Network in TensorFlow

Code taken from https://github.com/jakeret/tf_unet
TensorFlow Unet documentation: https://tf-unet.readthedocs.io/en/latest/installation.html

Installing TensorFlow Unet

Make sure TensorFlow is already installed; if not, see the TensorFlow installation instructions: link

  • Clone the GitHub project: git clone https://github.com/jakeret/tf_unet.git
  • Install the package:
    • $ cd tf_unet
    • $ pip install -r requirements.txt
    • $ python setup.py install --user
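If the install succeeds, the package can be imported from the active Python environment; a quick sanity check (nothing project-specific assumed):

    $ python -c "import tf_unet"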

Using the package

An example of using TensorFlow Unet in another project:

from tf_unet import unet, util, image_util

# prepare data loading
data_provider = image_util.ImageDataProvider("fishes/train/*.tif")

# setup & training; output_path is the directory where checkpoints and summaries are written
net = unet.Unet(layers=3, features_root=64, channels=1, n_class=2)
trainer = unet.Trainer(net)
path = trainer.train(data_provider, output_path, training_iters=32, epochs=100)

# verification (data and label are the test images and masks to evaluate on)
...
prediction = net.predict(path, data)
unet.error_rate(prediction, util.crop_to_shape(label, prediction.shape))

img = util.combine_img_prediction(data, label, prediction)
util.save_image(img, "prediction.jpg")
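In the example above, ImageDataProvider pairs each training image with its label mask by filename suffix (e.g. fish_1.tif and fish_1_mask.tif). If your masks use a different naming scheme, the suffixes can be overridden; a minimal sketch, assuming the data_suffix/mask_suffix keyword arguments described in the tf_unet docs:

from tf_unet import image_util

# Keyword argument names are taken from the tf_unet docs; verify them against
# image_util.ImageDataProvider if your version differs.
data_provider = image_util.ImageDataProvider("fishes/train/*.tif",
                                              data_suffix=".tif",
                                              mask_suffix="_mask.tif")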

Training progress can be tracked with TensorBoard; tf_unet logs the relevant summary metrics.
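The summaries are written under the output_path passed to Trainer.train, so TensorBoard just needs to be pointed at that directory (here assuming it was set to ./unet_trained, as in the toy example below):

    $ tensorboard --logdir ./unet_trained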

The modules of the tf_unet package (the links contain the API docs and source, so they are not repeated here):

  • unet module: link
  • image_util module: link
  • layers module: link

Demo programs that ship with the project (all tested by me)

They are all Jupyter notebooks, which makes them convenient to learn from.

TensorFlow 1.5.0 or newer is recommended. While testing this code I hit the error AttributeError: 'module' object has no attribute 'softmax_cross_entropy_with_logits_v2'; after looking it up, I found that this function does not exist in TensorFlow 1.4 and earlier. If you cannot install 1.5.0 or newer, you can run git checkout 0.1.0 to switch to version 0.1.0 of tf_unet.
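To check which TensorFlow version is active before running the notebooks:

import tensorflow as tf
print(tf.__version__)  # softmax_cross_entropy_with_logits_v2 requires >= 1.5.0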

from __future__ import division, print_function
%matplotlib inline
import matplotlib.pyplot as plt
import matplotlib
import numpy as np
plt.rcParams['image.cmap'] = 'gist_earth'
from tf_unet import image_gen
from tf_unet import unet
from tf_unet import util
nx = 572
ny = 572
generator = image_gen.GrayScaleDataProvider(nx, ny, cnt=20)
x_test, y_test = generator(1)

fig, ax = plt.subplots(1,2, sharey=True, figsize=(8,4))
ax[0].imshow(x_test[0,...,0], aspect="auto")
ax[1].imshow(y_test[0,...,1], aspect="auto")

<matplotlib.image.AxesImage at 0x64b8750>
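The provider returns batches in NHWC layout with one-hot labels, which is why the plot above indexes channel 0 of the image and class 1 of the label. A quick shape check (the expected values assume nx = ny = 572 as set above; not verified here):

print(x_test.shape)  # expected (1, 572, 572, 1): one grayscale channel
print(y_test.shape)  # expected (1, 572, 572, 2): background/foreground one-hot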

net = unet.Unet(channels=generator.channels, n_class=generator.n_class, layers=3, features_root=16)
2018-10-12 11:39:22,485 Layers 3, features 16, filter size 3x3, pool size: 2x2
trainer = unet.Trainer(net, optimizer="momentum", opt_kwargs=dict(momentum=0.2))
path = trainer.train(generator, "./unet_trained", training_iters=20, epochs=10, display_step=2)
2018-10-12 11:39:29,649 Removing '/home/songruoning/tf_unet/prediction'
2018-10-12 11:39:29,660 Removing '/home/songruoning/tf_unet/unet_trained'
2018-10-12 11:39:29,707 Allocating '/home/songruoning/tf_unet/prediction'
2018-10-12 11:39:29,712 Allocating '/home/songruoning/tf_unet/unet_trained'
2018-10-12 11:39:36,322 Verification error= 84.5%, loss= 0.7055
2018-10-12 11:39:37,783 Start optimization
2018-10-12 11:39:39,371 Iter 0, Minibatch Loss= 0.6087, Training Accuracy= 0.8474, Minibatch error= 15.3%
2018-10-12 11:39:40,313 Iter 2, Minibatch Loss= 0.5519, Training Accuracy= 0.8113, Minibatch error= 18.9%
2018-10-12 11:39:41,311 Iter 4, Minibatch Loss= 0.5442, Training Accuracy= 0.7797, Minibatch error= 22.0%
2018-10-12 11:39:42,400 Iter 6, Minibatch Loss= 0.4557, Training Accuracy= 0.8398, Minibatch error= 16.0%
2018-10-12 11:39:43,265 Iter 8, Minibatch Loss= 0.4258, Training Accuracy= 0.8523, Minibatch error= 14.8%
2018-10-12 11:39:44,239 Iter 10, Minibatch Loss= 0.4505, Training Accuracy= 0.8334, Minibatch error= 16.7%
......
2018-10-12 11:41:30,394 Iter 178, Minibatch Loss= 0.2114, Training Accuracy= 0.9058, Minibatch error= 9.4%
2018-10-12 11:41:30,749 Epoch 8, Average loss: 0.2805, learning rate: 0.1327
2018-10-12 11:41:30,940 Verification error= 7.3%, loss= 0.2117
2018-10-12 11:41:33,629 Iter 180, Minibatch Loss= 0.1934, Training Accuracy= 0.9368, Minibatch error= 6.3%
2018-10-12 11:41:34,705 Iter 182, Minibatch Loss= 0.1574, Training Accuracy= 0.9487, Minibatch error= 5.1%
2018-10-12 11:41:35,795 Iter 184, Minibatch Loss= 0.1452, Training Accuracy= 0.9536, Minibatch error= 4.6%
2018-10-12 11:41:36,947 Iter 186, Minibatch Loss= 0.1582, Training Accuracy= 0.9421, Minibatch error= 5.8%
2018-10-12 11:41:37,928 Iter 188, Minibatch Loss= 0.1466, Training Accuracy= 0.9388, Minibatch error= 6.1%
2018-10-12 11:41:39,022 Iter 190, Minibatch Loss= 0.1780, Training Accuracy= 0.9360, Minibatch error= 6.4%
2018-10-12 11:41:40,164 Iter 192, Minibatch Loss= 0.1010, Training Accuracy= 0.9702, Minibatch error= 3.0%
2018-10-12 11:41:41,344 Iter 194, Minibatch Loss= 0.2422, Training Accuracy= 0.9180, Minibatch error= 8.2%
2018-10-12 11:41:42,465 Iter 196, Minibatch Loss= 0.1394, Training Accuracy= 0.9664, Minibatch error= 3.4%
2018-10-12 11:41:43,670 Iter 198, Minibatch Loss= 0.1302, Training Accuracy= 0.9575, Minibatch error= 4.2%
2018-10-12 11:41:44,016 Epoch 9, Average loss: 0.1315, learning rate: 0.1260
2018-10-12 11:41:44,713 Verification error= 3.9%, loss= 0.1573
2018-10-12 11:41:46,402 Optimization Finished!
x_test, y_test = generator(1)
prediction = net.predict("./unet_trained/model.cpkt", x_test)
INFO:tensorflow:Restoring parameters from ./unet_trained/model.cpkt
2018-10-12 11:42:14,147 Restoring parameters from ./unet_trained/model.cpkt
2018-10-12 11:42:15,091 Model restored from file: ./unet_trained/model.cpkt
fig, ax = plt.subplots(1, 3, sharex=True, sharey=True, figsize=(12,5))
ax[0].imshow(x_test[0,...,0], aspect="auto")
ax[1].imshow(y_test[0,...,1], aspect="auto")
mask = prediction[0,...,1] > 0.9
ax[2].imshow(mask, aspect="auto")
ax[0].set_title("Input")
ax[1].set_title("Ground truth")
ax[2].set_title("Prediction")
fig.tight_layout()
fig.savefig("docs/toy_problem.png")

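Because the network uses unpadded convolutions, prediction is spatially smaller than the 572x572 input (which is also why util.crop_to_shape appears in the usage example earlier), and it holds per-pixel class probabilities, so the mask above keeps pixels whose class-1 probability exceeds 0.9. A quick way to see the size difference, using the variables from the cells above:

print(x_test.shape, prediction.shape)  # the prediction's height/width are smaller than the input's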

  • Radio frequency interference (RFI) detection: link
