Preface
If you are not yet familiar with UNet, see 动手实现基于pytorch框架的UNet模型 (a hands-on PyTorch UNet implementation); readers unfamiliar with ResNet can refer to 经典网络架构学习-ResNet (a walkthrough of the classic ResNet architecture).
Enhanced UNet vs. Basic UNet
- All plain convolution blocks are replaced with residual blocks (see the sketch after this list).
- PReLU is used for the activation layers.
- Dropout layers are added.
- Instance normalization is used for the normalization layers (InstanceNorm3d in 3D, InstanceNorm2d in 2D).
- The convolutions are standard Conv and ConvTranspose layers.
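Putting these pieces together, a single enhanced block could look like the following plain-PyTorch sketch (a hypothetical illustration for orientation only; MONAI's actual ResidualUnit is dissected later in this post):

import torch.nn as nn

class ResBlock3d(nn.Module):
    """One block: Conv3d -> InstanceNorm3d -> Dropout -> PReLU, plus a skip connection."""

    def __init__(self, channels: int, dropout: float = 0.0):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv3d(channels, channels, kernel_size=3, padding=1),
            nn.InstanceNorm3d(channels),
            nn.Dropout(dropout),
            nn.PReLU(),
        )

    def forward(self, x):
        return self.body(x) + x  # residual connection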
Network Architecture Analysis
Analyzing the code reveals three parts (a simplified sketch of the recursive construction follows the list):
- The first down layer.
- The intermediate skip connection based block.
- The final up layer.
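The construction is recursive: each level places a down block and an up block around a SkipConnection that wraps the next deeper level, and the SkipConnection concatenates its input with its submodule's output along the channel dimension. The following is a simplified sketch of that pattern, not MONAI's exact code; make_down, make_bottom, and make_up are hypothetical block factories:

import torch.nn as nn
from monai.networks.layers.simplelayers import SkipConnection

def create_block(in_ch, out_ch, channels, strides, make_down, make_bottom, make_up):
    ch, stride = channels[0], strides[0]
    if len(channels) > 2:
        # recurse: the submodule is the whole deeper part of the "U"
        sub = create_block(ch, ch, channels[1:], strides[1:],
                           make_down, make_bottom, make_up)
        up_in = ch * 2                      # skip features + upsampled features
    else:
        sub = make_bottom(ch, channels[1])  # bottom of the "U"
        up_in = ch + channels[1]
    down = make_down(in_ch, ch, stride)     # encoder step at this level
    up = make_up(up_in, out_ch, stride)     # decoder step at this level
    return nn.Sequential(down, SkipConnection(sub), up)

This is why the first up layer in the printout below is ConvTranspose2d(24, 4, ...): it receives 8 skip channels concatenated with 16 bottom channels.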
The implementation follows the paper Left-Ventricle Quantification Using Residual U-Net.
First, let's build a UNet instance to inspect its structure. num_res_units is set to 2 here; this parameter controls how many residual units each layer uses. The following code builds the network with MONAI.
from monai.networks.nets import UNet
from torchinfo import summary

# 3-layer network with down/upsampling by a factor of 2 at each layer,
# using 2-convolution residual units
net = UNet(
    spatial_dims=2,
    in_channels=1,
    out_channels=1,
    channels=(4, 8, 16),
    strides=(2, 2),
    num_res_units=2,
)
print(net)
summary(net, (1, 1, 224, 224))
UNet(
  (model): Sequential(
    (0): ResidualUnit(
      (conv): Sequential(
        (unit0): Convolution(
          (conv): Conv2d(1, 4, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1))
          (adn): ADN(
            (N): InstanceNorm2d(4, eps=1e-05, momentum=0.1, affine=False, track_running_stats=False)
            (D): Dropout(p=0.0, inplace=False)
            (A): PReLU(num_parameters=1)
          )
        )
        (unit1): Convolution(
          (conv): Conv2d(4, 4, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
          (adn): ADN(
            (N): InstanceNorm2d(4, eps=1e-05, momentum=0.1, affine=False, track_running_stats=False)
            (D): Dropout(p=0.0, inplace=False)
            (A): PReLU(num_parameters=1)
          )
        )
      )
      (residual): Conv2d(1, 4, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1))
    )
    (1): SkipConnection(
      (submodule): Sequential(
        (0): ResidualUnit(
          (conv): Sequential(
            (unit0): Convolution(
              (conv): Conv2d(4, 8, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1))
              (adn): ADN(
                (N): InstanceNorm2d(8, eps=1e-05, momentum=0.1, affine=False, track_running_stats=False)
                (D): Dropout(p=0.0, inplace=False)
                (A): PReLU(num_parameters=1)
              )
            )
            (unit1): Convolution(
              (conv): Conv2d(8, 8, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
              (adn): ADN(
                (N): InstanceNorm2d(8, eps=1e-05, momentum=0.1, affine=False, track_running_stats=False)
                (D): Dropout(p=0.0, inplace=False)
                (A): PReLU(num_parameters=1)
              )
            )
          )
          (residual): Conv2d(4, 8, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1))
        )
        (1): SkipConnection(
          (submodule): ResidualUnit(
            (conv): Sequential(
              (unit0): Convolution(
                (conv): Conv2d(8, 16, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
                (adn): ADN(
                  (N): InstanceNorm2d(16, eps=1e-05, momentum=0.1, affine=False, track_running_stats=False)
                  (D): Dropout(p=0.0, inplace=False)
                  (A): PReLU(num_parameters=1)
                )
              )
              (unit1): Convolution(
                (conv): Conv2d(16, 16, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
                (adn): ADN(
                  (N): InstanceNorm2d(16, eps=1e-05, momentum=0.1, affine=False, track_running_stats=False)
                  (D): Dropout(p=0.0, inplace=False)
                  (A): PReLU(num_parameters=1)
                )
              )
            )
            (residual): Conv2d(8, 16, kernel_size=(1, 1), stride=(1, 1))
          )
        )
        (2): Sequential(
          (0): Convolution(
            (conv): ConvTranspose2d(24, 4, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), output_padding=(1, 1))
            (adn): ADN(
              (N): InstanceNorm2d(4, eps=1e-05, momentum=0.1, affine=False, track_running_stats=False)
              (D): Dropout(p=0.0, inplace=False)
              (A): PReLU(num_parameters=1)
            )
          )
          (1): ResidualUnit(
            (conv): Sequential(
              (unit0): Convolution(
                (conv): Conv2d(4, 4, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
                (adn): ADN(
                  (N): InstanceNorm2d(4, eps=1e-05, momentum=0.1, affine=False, track_running_stats=False)
                  (D): Dropout(p=0.0, inplace=False)
                  (A): PReLU(num_parameters=1)
                )
              )
            )
            (residual): Identity()
          )
        )
      )
    )
    (2): Sequential(
      (0): Convolution(
        (conv): ConvTranspose2d(8, 1, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), output_padding=(1, 1))
        (adn): ADN(
          (N): InstanceNorm2d(1, eps=1e-05, momentum=0.1, affine=False, track_running_stats=False)
          (D): Dropout(p=0.0, inplace=False)
          (A): PReLU(num_parameters=1)
        )
      )
      (1): ResidualUnit(
        (conv): Sequential(
          (unit0): Convolution(
            (conv): Conv2d(1, 1, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
          )
        )
        (residual): Identity()
      )
    )
  )
)
====================================================================================================
Layer (type:depth-idx)                   Output Shape              Param #
====================================================================================================
UNet                                     [1, 1, 224, 224]          --
├─Sequential: 1-1                        [1, 1, 224, 224]          --
│    └─ResidualUnit: 2-1                 [1, 4, 112, 112]          --
│    │    └─Conv2d: 3-1                  [1, 4, 112, 112]          40
│    │    └─Sequential: 3-2              [1, 4, 112, 112]          190
│    └─SkipConnection: 2-2               [1, 8, 112, 112]          --
│    │    └─Sequential: 3-3              [1, 4, 112, 112]          5,830
│    └─Sequential: 2-3                   [1, 1, 224, 224]          --
│    │    └─Convolution: 3-4             [1, 1, 224, 224]          74
│    │    └─ResidualUnit: 3-5            [1, 1, 224, 224]          10
====================================================================================================
Total params: 6,144
Trainable params: 6,144
Non-trainable params: 0
Total mult-adds (M): 34.85
====================================================================================================
Input size (MB): 0.20
Forward/backward pass size (MB): 7.83
Params size (MB): 0.02
Estimated Total Size (MB): 8.05
====================================================================================================
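As a quick sanity check of the instance above (a minimal usage sketch with an arbitrary input tensor), note that strides=(2, 2) downsamples twice by a factor of 2, so the input's spatial size should be divisible by 4 (see the input-size-constraints notebook in the references):

import torch

x = torch.randn(1, 1, 224, 224)  # 224 is divisible by 4, so all shapes line up
y = net(x)
print(y.shape)                   # expected: torch.Size([1, 1, 224, 224])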
In the printed model structure, Convolution is MONAI's own wrapper around a convolution layer.
ADN: builds a sequential module made up of optional activation (A), dropout (D), and normalization (N) layers.
Convolution
Constructs a convolution with normalization, optional dropout, and an optional activation layer, arranged as:
-- (Conv|ConvTrans) -- (Norm -- Dropout -- Acti) --
example:
from monai.networks.blocks import Convolution

conv = Convolution(
    spatial_dims=3,
    in_channels=1,
    out_channels=1,
    adn_ordering="ADN",
    act=("prelu", {"init": 0.2}),
    dropout=0.1,
    norm=("layer", {"normalized_shape": (10, 10, 10)}),
)
print(conv)
output:
Convolution(
  (conv): Conv3d(1, 1, kernel_size=(3, 3, 3), stride=(1, 1, 1), padding=(1, 1, 1))
  (adn): ADN(
    (A): PReLU(num_parameters=1)
    (D): Dropout(p=0.1, inplace=False)
    (N): LayerNorm((10, 10, 10), eps=1e-05, elementwise_affine=True)
  )
)
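The adn_ordering string is what determines the order of the three sublayers. For instance, swapping "ADN" for "NDA" in the same constructor (a small variation on the example above, not from the original post) should make the ADN children print in N, D, A order:

conv_nda = Convolution(
    spatial_dims=3,
    in_channels=1,
    out_channels=1,
    adn_ordering="NDA",  # normalization first, then dropout, then activation
    act=("prelu", {"init": 0.2}),
    dropout=0.1,
    norm=("layer", {"normalized_shape": (10, 10, 10)}),
)
print(conv_nda)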
The ADN Module
# activation, group norm, dropout
>>> from monai.networks.blocks import ADN
>>> norm_params = ("GROUP", {"num_groups": 1, "affine": False})
>>> ADN(norm=norm_params, in_channels=1, dropout_dim=1, dropout=0.8, ordering="AND")
ADN(
  (A): ReLU()
  (N): GroupNorm(1, 1, eps=1e-05, affine=False)
  (D): Dropout(p=0.8, inplace=False)
)

# LeakyReLU, dropout
>>> act_params = ("leakyrelu", {"negative_slope": 0.1, "inplace": True})
>>> ADN(act=act_params, in_channels=1, dropout_dim=1, dropout=0.8, ordering="AD")
ADN(
  (A): LeakyReLU(negative_slope=0.1, inplace=True)
  (D): Dropout(p=0.8, inplace=False)
)
ResidualUnit
Implementation of a residual unit: a sequence of convolutions plus a residual (skip) path.
example:
from monai.networks.blocks import ResidualUnit

convs = ResidualUnit(
    spatial_dims=3,
    in_channels=1,
    out_channels=1,
    adn_ordering="AN",
    act=("prelu", {"init": 0.2}),
    norm=("layer", {"normalized_shape": (10, 10, 10)}),
)
print(convs)
output:
ResidualUnit(
  (conv): Sequential(
    (unit0): Convolution(
      (conv): Conv3d(1, 1, kernel_size=(3, 3, 3), stride=(1, 1, 1), padding=(1, 1, 1))
      (adn): ADN(
        (A): PReLU(num_parameters=1)
        (N): LayerNorm((10, 10, 10), eps=1e-05, elementwise_affine=True)
      )
    )
    (unit1): Convolution(
      (conv): Conv3d(1, 1, kernel_size=(3, 3, 3), stride=(1, 1, 1), padding=(1, 1, 1))
      (adn): ADN(
        (A): PReLU(num_parameters=1)
        (N): LayerNorm((10, 10, 10), eps=1e-05, elementwise_affine=True)
      )
    )
  )
  (residual): Identity()
)
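Here the residual path is Identity() because in_channels == out_channels and the stride is 1. When the shapes differ, as in the UNet printout earlier (e.g. (residual): Conv2d(8, 16, kernel_size=(1, 1))), the skip path becomes a convolution so the element-wise addition still has matching shapes. A small sketch of that case under the same API (illustrative; the printed result assumes MONAI's documented behavior):

convs2 = ResidualUnit(
    spatial_dims=3,
    in_channels=1,
    out_channels=4,  # channel change forces a projection on the residual path
)
print(convs2.residual)  # expected: Conv3d(1, 4, kernel_size=(1, 1, 1), ...) instead of Identity()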
References
- https://github.com/Project-MONAI/tutorials/blob/main/modules/UNet_input_size_constrains.ipynb
- convolutions.py
- acti_norm.py