Squeeze-and-Excitation Networks

Introduction to SENet

The core building block of a convolutional neural network (CNN) is the convolution operation, which lets the network construct informative features by fusing spatial and channel-wise information within the local receptive field at each layer. A large body of prior work has investigated the spatial component of this relationship, seeking to strengthen the representational power of CNNs by improving the quality of the spatial encodings throughout the feature hierarchy. In this work, the authors focus instead on the channel relationship and propose a novel architectural unit, termed the "Squeeze-and-Excitation" (SE) block, which adaptively recalibrates channel-wise feature responses by explicitly modelling the interdependencies between channels.

Squeeze-and-Excitation Blocks

Squeeze: Global Information Embedding
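As a brief sketch, following the notation of the original paper (U is the output of the preceding transformation, of size H × W × C, and u_c is its c-th channel): the squeeze operation is global average pooling, which compresses each channel into a single descriptor z_c.

$$
z_c = F_{sq}(\mathbf{u}_c) = \frac{1}{H \times W} \sum_{i=1}^{H} \sum_{j=1}^{W} u_c(i, j)
$$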

Excitation: Adaptive Recalibration
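Continuing the sketch above, the excitation operation passes the channel descriptor z through a bottleneck of two fully connected layers (reduction ratio r, ReLU δ, sigmoid σ) and uses the resulting gate s to rescale each channel of U:

$$
\mathbf{s} = F_{ex}(\mathbf{z}, \mathbf{W}) = \sigma\!\left(\mathbf{W}_2\, \delta(\mathbf{W}_1 \mathbf{z})\right), \qquad
\tilde{\mathbf{x}}_c = F_{scale}(\mathbf{u}_c, s_c) = s_c \cdot \mathbf{u}_c
$$

where W_1 has shape (C/r) × C and W_2 has shape C × (C/r).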

Instantiation in ResNet and Inception

Code

Caffe

Caffe SENet

Third-party implementations

  1. Caffe. SE-modules are integrated with a modified ResNet-50 that uses stride 2 in the 3x3 convolution instead of the first 1x1 convolution, which obtains better performance: Repository.
  2. TensorFlow. SE-modules are integrated with a pre-activation ResNet-50 which follows the setup in fb.resnet.torch: Repository.
  3. TensorFlow. Simple Tensorflow implementation of SENets using Cifar10: Repository.
  4. MatConvNet. All the released SENets are imported into MatConvNet: Repository.
  5. MXNet. SE-modules are integrated with the ResNeXt and more architectures are coming soon: Repository.
  6. PyTorch. Implementation of SENets in PyTorch: Repository.
  7. Chainer. Implementation of SENets in Chainer: Repository.

PyTorch Implementation of the SE Module

The se_module.py file from https://github.com/moskomule/senet.pytorch/blob/master/senet/se_module.py:

from torch import nn


class SELayer(nn.Module):
    """Squeeze-and-Excitation layer: channel-wise recalibration of a feature map."""

    def __init__(self, channel, reduction=16):
        super(SELayer, self).__init__()
        # Squeeze: global average pooling collapses each channel to a single value.
        self.avg_pool = nn.AdaptiveAvgPool2d(1)
        # Excitation: bottleneck MLP (reduction ratio `reduction`) followed by a sigmoid gate.
        self.fc = nn.Sequential(
            nn.Linear(channel, channel // reduction, bias=False),
            nn.ReLU(inplace=True),
            nn.Linear(channel // reduction, channel, bias=False),
            nn.Sigmoid()
        )

    def forward(self, x):
        b, c, _, _ = x.size()
        # Squeeze: (b, c, h, w) -> (b, c)
        y = self.avg_pool(x).view(b, c)
        # Excitation: per-channel weights in [0, 1], reshaped for broadcasting.
        y = self.fc(y).view(b, c, 1, 1)
        # Scale: recalibrate the input channel-wise.
        return x * y.expand_as(x)
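
For context, here is a minimal sketch (not from the original repository) of how such an SE layer is typically inserted into a ResNet-style basic block: the recalibration is applied to the residual branch before the identity shortcut is added back. The SEBasicBlock name and the layer sizes are illustrative assumptions; a real SE-ResNet block would also handle downsampling and channel expansion.

import torch
from torch import nn


class SEBasicBlock(nn.Module):
    """Illustrative SE-ResNet basic block: conv -> conv -> SE -> add identity."""

    def __init__(self, channels, reduction=16):
        super(SEBasicBlock, self).__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.se = SELayer(channels, reduction)  # SELayer as defined above

    def forward(self, x):
        identity = x
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        out = self.se(out)        # recalibrate the residual branch channel-wise
        out = out + identity      # identity shortcut (no downsampling in this sketch)
        return self.relu(out)


if __name__ == "__main__":
    block = SEBasicBlock(channels=64)
    x = torch.randn(2, 64, 32, 32)
    print(block(x).shape)  # torch.Size([2, 64, 32, 32])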