AutoInt: Automatic Feature Interaction Learning via Self-Attentive Neural Networks

Date: 2019.08

Authors: Weiping Song, Chence Shi, Zhiping Xiao, Zhijian Duan, Yewen Xu, Ming Zhang, Jian Tang

 

Abstract

The paper uses multi-head self-attention to learn feature interactions automatically.

 

Model Architecture

[Figure: overall AutoInt model architecture]

Embedding Layer

[Figure: embedding layer]

Both categorical features and numerical features are represented as embeddings.
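
A minimal PyTorch sketch of this idea, under assumed field counts and an embedding size d (the class and parameter names are illustrative, not the paper's code): each categorical field gets its own embedding table, and each numerical field scales a learned embedding vector by the field's value.

```python
import torch
import torch.nn as nn

class FeatureEmbedding(nn.Module):
    """Map every field (categorical or numerical) to a d-dimensional embedding."""
    def __init__(self, cat_cardinalities, num_numeric_fields, d=16):
        super().__init__()
        # One embedding table per categorical field: e_i = V_i x_i (one-hot lookup)
        self.cat_embeddings = nn.ModuleList(
            [nn.Embedding(card, d) for card in cat_cardinalities]
        )
        # One learned vector per numerical field: e_m = v_m * x_m
        self.num_embeddings = nn.Parameter(torch.randn(num_numeric_fields, d))

    def forward(self, cat_x, num_x):
        # cat_x: (batch, n_cat) integer ids; num_x: (batch, n_num) float values
        cat_e = [emb(cat_x[:, i]) for i, emb in enumerate(self.cat_embeddings)]
        cat_e = torch.stack(cat_e, dim=1)                  # (batch, n_cat, d)
        num_e = num_x.unsqueeze(-1) * self.num_embeddings  # (batch, n_num, d)
        return torch.cat([cat_e, num_e], dim=1)            # (batch, M, d)
```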

 

Multi-head Self-Attention

[Figure: interacting layer with multi-head self-attention]

Multi-head self-attention projects the features into several subspaces, and each subspace can learn a different set of feature combinations.

  1. The input features are linearly projected into the attention space via matrix multiplication. For each feature embedding $e_m$ in a given attention subspace $h$, there are three vector representations: the query $W_{Query}^{(h)} e_m$, the key $W_{Key}^{(h)} e_m$, and the value $W_{Value}^{(h)} e_m$.
  2. Compute the similarity between $e_m$ and every other feature $e_k$; the paper uses the inner product: $\psi^{(h)}(e_m, e_k) = \langle W_{Query}^{(h)} e_m,\ W_{Key}^{(h)} e_k \rangle$.
  3. Normalize the attention distribution with softmax: $\alpha_{m,k}^{(h)} = \dfrac{\exp\big(\psi^{(h)}(e_m, e_k)\big)}{\sum_{l=1}^{M} \exp\big(\psi^{(h)}(e_m, e_l)\big)}$.
  4. A weighted sum yields a new feature composed of feature $m$ and its relevant features: $\tilde{e}_m^{(h)} = \sum_{k=1}^{M} \alpha_{m,k}^{(h)} \big(W_{Value}^{(h)} e_k\big)$.

Assuming there are $H$ attention subspaces, the results of all heads are concatenated to obtain the final representation of feature $m$: $\tilde{e}_m = \tilde{e}_m^{(1)} \oplus \tilde{e}_m^{(2)} \oplus \cdots \oplus \tilde{e}_m^{(H)}$ (see the sketch below).
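
A sketch of one interacting layer following the four steps above, assuming the field embeddings have already been stacked into a (batch, M, d) tensor; the head count, per-head dimension, and class name are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class InteractingLayer(nn.Module):
    """One multi-head self-attention layer over the M field embeddings."""
    def __init__(self, d, d_head=8, n_heads=2):
        super().__init__()
        self.n_heads, self.d_head = n_heads, d_head
        # Per-head W_Query, W_Key, W_Value, packed into single linear maps
        self.W_q = nn.Linear(d, d_head * n_heads, bias=False)
        self.W_k = nn.Linear(d, d_head * n_heads, bias=False)
        self.W_v = nn.Linear(d, d_head * n_heads, bias=False)

    def forward(self, e):
        # e: (batch, M, d)
        B, M, _ = e.shape
        # Step 1: project each feature into every head -> (batch, n_heads, M, d_head)
        q = self.W_q(e).view(B, M, self.n_heads, self.d_head).transpose(1, 2)
        k = self.W_k(e).view(B, M, self.n_heads, self.d_head).transpose(1, 2)
        v = self.W_v(e).view(B, M, self.n_heads, self.d_head).transpose(1, 2)
        # Step 2: inner-product similarity between every pair of features
        scores = q @ k.transpose(-2, -1)          # (batch, n_heads, M, M)
        # Step 3: softmax over the "other features" axis
        alpha = F.softmax(scores, dim=-1)
        # Step 4: weighted sum of the value vectors
        out = alpha @ v                           # (batch, n_heads, M, d_head)
        # Concatenate the H heads for every feature
        return out.transpose(1, 2).reshape(B, M, self.n_heads * self.d_head)
```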

A residual connection (as in ResNet) can be used to preserve part of the original feature information for the next layer to keep learning:

$e_m^{Res} = \mathrm{ReLU}\big(\tilde{e}_m + W_{Res}\, e_m\big)$

Finally, the results of all features are concatenated to compute the final output:

$\hat{y} = \sigma\big(w^{T}(e_1^{Res} \oplus e_2^{Res} \oplus \cdots \oplus e_M^{Res}) + b\big)$
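
Continuing in the same sketch style, the residual connection and the sigmoid prediction could be wired up roughly as follows; `W_res` plays the role of $W_{Res}$ above, and the class and argument names are assumptions rather than the paper's implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AutoIntOutput(nn.Module):
    """Residual connection over the interacting layer plus a sigmoid prediction head."""
    def __init__(self, d, d_out, n_fields):
        super().__init__()
        self.W_res = nn.Linear(d, d_out, bias=False)   # projects e_m to the attention output size
        self.predict = nn.Linear(n_fields * d_out, 1)  # w^T (e_1 ⊕ ... ⊕ e_M) + b

    def forward(self, e, attn_out):
        # e: (batch, M, d) original embeddings; attn_out: (batch, M, d_out) from the interacting layer
        e_res = F.relu(attn_out + self.W_res(e))       # e_m^Res = ReLU(ĕ_m + W_Res e_m)
        flat = e_res.flatten(start_dim=1)              # concatenate all M features
        return torch.sigmoid(self.predict(flat))       # ŷ in (0, 1)
```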

 

Reference

https://zhuanlan.zhihu.com/p/53462648