Keras Attention Mechanism
Many-to-one attention mechanism for Keras.
Installation via pip
pip install attention
Import in your source code
from tensorflow.keras.layers import Dense, LSTM
from tensorflow.keras.models import Sequential

from attention import Attention

# [...]
m = Sequential([
    LSTM(128, input_shape=(seq_length, 1), return_sequences=True),
    Attention(),  # <--------- here.
    Dense(1, activation='linear')
])
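For context, here is a minimal end-to-end sketch of the snippet above: it builds the same model, fits it on random dummy data, and predicts. The value of seq_length, the data shapes, and the training settings are illustrative assumptions, not part of the original README.

import numpy as np
from tensorflow.keras.layers import Dense, LSTM
from tensorflow.keras.models import Sequential

from attention import Attention

seq_length = 10  # assumed sequence length, for illustration only

# Dummy many-to-one regression data: 32 sequences of seq_length scalars,
# each mapped to a single target value.
x = np.random.uniform(size=(32, seq_length, 1))
y = np.random.uniform(size=(32, 1))

m = Sequential([
    LSTM(128, input_shape=(seq_length, 1), return_sequences=True),
    Attention(),  # attends over the LSTM's per-timestep outputs
    Dense(1, activation='linear')
])
m.compile(loss='mae', optimizer='adam')
m.fit(x, y, epochs=2, batch_size=8)
print(m.predict(x).shape)  # (32, 1): one prediction per input sequence

Note that the LSTM must return its full output sequence (return_sequences=True) so the Attention layer has one vector per timestep to weight and combine into a single summary vector.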
Examples
Before running the examples, first: