What is: Convolutional Block Attention Module?
Source | CBAM: Convolutional Block Attention Module |
Year | 2018 |
Data Source | CC BY-SA - https://paperswithcode.com |
Convolutional Block Attention Module (CBAM) is an attention module for convolutional neural networks. Given an intermediate feature map, the module sequentially infers attention maps along two separate dimensions, channel and spatial; the attention maps are then multiplied with the input feature map for adaptive feature refinement.
Given an intermediate feature map $\mathbf{F} \in \mathbb{R}^{C \times H \times W}$ as input, CBAM sequentially infers a 1D channel attention map $\mathbf{M_c} \in \mathbb{R}^{C \times 1 \times 1}$ and a 2D spatial attention map $\mathbf{M_s} \in \mathbb{R}^{1 \times H \times W}$. The overall attention process can be summarized as:

$$\mathbf{F'} = \mathbf{M_c}(\mathbf{F}) \otimes \mathbf{F}$$
$$\mathbf{F''} = \mathbf{M_s}(\mathbf{F'}) \otimes \mathbf{F'}$$

where $\otimes$ denotes element-wise multiplication. During multiplication, the attention values are broadcast (copied) accordingly: channel attention values are broadcast along the spatial dimensions, and spatial attention values along the channel dimension. $\mathbf{F''}$ is the final refined output.
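To make the channel-then-spatial refinement concrete, below is a minimal PyTorch sketch of the process. The internals of the two sub-modules (a shared MLP applied to average- and max-pooled channel descriptors, and a 7×7 convolution over pooled spatial descriptors) follow the original CBAM paper rather than anything stated in the summary above; class names such as `ChannelAttention` and `SpatialAttention` and the `reduction` hyperparameter are illustrative choices, not a reference implementation.

```python
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Infers the 1D channel attention map M_c of shape (C, 1, 1)."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        # Shared MLP (1x1 convolutions) applied to both pooled descriptors.
        self.mlp = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, kernel_size=1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1, bias=False),
        )

    def forward(self, x):
        avg_pool = torch.mean(x, dim=(2, 3), keepdim=True)   # (N, C, 1, 1)
        max_pool = torch.amax(x, dim=(2, 3), keepdim=True)   # (N, C, 1, 1)
        return torch.sigmoid(self.mlp(avg_pool) + self.mlp(max_pool))


class SpatialAttention(nn.Module):
    """Infers the 2D spatial attention map M_s of shape (1, H, W)."""
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2, bias=False)

    def forward(self, x):
        avg_pool = torch.mean(x, dim=1, keepdim=True)         # (N, 1, H, W)
        max_pool = torch.amax(x, dim=1, keepdim=True)         # (N, 1, H, W)
        return torch.sigmoid(self.conv(torch.cat([avg_pool, max_pool], dim=1)))


class CBAM(nn.Module):
    """Sequential refinement: F' = M_c(F) * F, then F'' = M_s(F') * F'."""
    def __init__(self, channels, reduction=16, spatial_kernel=7):
        super().__init__()
        self.channel_attn = ChannelAttention(channels, reduction)
        self.spatial_attn = SpatialAttention(spatial_kernel)

    def forward(self, x):
        x = self.channel_attn(x) * x   # channel map broadcast over H and W
        x = self.spatial_attn(x) * x   # spatial map broadcast over C
        return x


if __name__ == "__main__":
    feat = torch.randn(2, 64, 32, 32)          # an intermediate feature map F
    refined = CBAM(channels=64)(feat)           # the refined output F''
    print(refined.shape)                        # torch.Size([2, 64, 32, 32])
```

Because the attention maps have singleton dimensions ((C, 1, 1) and (1, H, W)), the element-wise multiplications above rely on ordinary tensor broadcasting, which matches the broadcasting behavior described in the text.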