Artificial Intelligence - Deep Learning - Attention - Attention-based LSTM/Dense implemented in Keras

Uploader: admin_maxin | Uploaded: 2022-05-13 09:08:47 | File size: 1.26MB | File type: ZIP
Attention-based LSTM/Dense implemented in Keras. The attention mechanism works as follows:

- X is the input sequence of length n.
- H = LSTM(X); the LSTM has return_sequences = True, so H = [h1, h2, ..., hn] is a sequence of n output vectors. s denotes the LSTM's hidden state (h and c).
- The context vector h is a weighted sum over H: h = sum(j = 1 to n) alpha_j * h_j.
- The weights alpha are computed as: M = tanh(H), alpha = softmax(w^T * M).
- The pooled representation is h* = tanh(h), and the prediction is y = softmax(W * h* + b).
- The training objective J(theta) is the negative log-likelihood (cross-entropy) of the true labels.
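The steps above can be sketched as a small custom Keras layer. This is a minimal illustration of the described math, not code from the uploaded archive; the layer name, sequence length, feature dimension, and class count are all assumed for the example:

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

class AttentionPooling(layers.Layer):
    """Scores each LSTM output h_j with w^T tanh(h_j), softmaxes the
    scores into weights alpha, and returns tanh of the weighted sum."""

    def build(self, input_shape):
        d = int(input_shape[-1])
        # w: the learned scoring vector from alpha = softmax(w^T * M)
        self.w = self.add_weight(name="w", shape=(d, 1),
                                 initializer="glorot_uniform")

    def call(self, H):
        # H: (batch, n, d) -- the full LSTM output sequence
        M = tf.tanh(H)                          # M = tanh(H)
        scores = tf.matmul(M, self.w)           # (batch, n, 1)
        alpha = tf.nn.softmax(scores, axis=1)   # attention weights over time
        h = tf.reduce_sum(alpha * H, axis=1)    # weighted sum: (batch, d)
        return tf.tanh(h)                       # h* = tanh(h)

# Hypothetical shapes: sequence length 20, feature dim 8, 3 classes.
inputs = keras.Input(shape=(20, 8))
H = layers.LSTM(32, return_sequences=True)(inputs)  # H is a sequence
h_star = AttentionPooling()(H)
outputs = layers.Dense(3, activation="softmax")(h_star)  # y = softmax(W h* + b)
model = keras.Model(inputs, outputs)
# Cross-entropy loss corresponds to the negative log-likelihood objective J(theta).
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```

Note that return_sequences=True is essential: without it the LSTM returns only the final hidden state, and there is no sequence H to attend over.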

File Download

Resource Details

(62 files, 1.26MB) Attention-based LSTM/Dense implemented in Keras:

LSTM_Attention-master/
  attention_lstm.py (8.77KB)
  attModel1/
    custom_recurrents_test.py (3.64KB)
    custom_recurrents.py (14.58KB)
    __pycache__/
      custom_current.cpython-35.pyc (6.47KB)
      tdd.cpython-35.pyc (1.68KB)
      custom_recurrents.cpython-35.pyc (7.41KB)
    1.png (78.73KB)
    tdd.py (1.78KB)
    custom_recurrents_self.py (9.65KB)
    1409.0473.pdf (434.06KB)
    README.md (2.52KB)
  attModel4_yyl/
    NAM.png (73.45KB)
    customize_layer.py (5.33KB)
    formula.png (37.26KB)
    README.md (159B)
  hierarchical-attention_synthesio/
    1.png (81.70KB)
    model.py (5.64KB)
    README.md (498B)
    imdb_train.py (1.42KB)
  attModel2/
    attention_lstm.py (3.41KB)
    __pycache__/
      attention_utils.cpython-35.pyc (3.09KB)
    attention_utils.py (2.55KB)
    attention.png (427.01KB)
    README.md (586B)
  attModel1-luoling/
    attention_recurrent.py (10.40KB)
    5.png (11.12KB)
    1.png (29.95KB)
    resAtt.png (82.95KB)
    attentionLayer_luoling.py (8.87KB)
    tdd.py (1.78KB)
    2.png (47.03KB)
    4.png (13.81KB)
    README.md (201B)
    3.png (11.42KB)
  attModel3(error)/
    attention_wrapper_test.py (3.74KB)
    __pycache__/
      attention_wrapper.cpython-35.pyc (4.45KB)
    attention_wrapper.py (4.22KB)
    README.md (227B)
  .idea/
    LSTM_Attention.iml (459B)
    misc.xml (253B)
    vcs.xml (180B)
    modules.xml (280B)
    workspace.xml (16.52KB)
  hierarchical-attention_yyl/
    model_utils.py (5.60KB)
    textHAN.py (1.87KB)
    model.py (1.39KB)
  main_new.py (6.08KB)
  self-attention/
    self_attention.py (6.51KB)
    self_attention2.py (4.06KB)
    self.png (16.45KB)
    README.md (285B)
  README.md (2.01KB)
  utils/
    __pycache__/
      attention_lstm.cpython-35.pyc (7.43KB)
      process_data.cpython-35.pyc (5.99KB)
      attention_wrapper.cpython-35.pyc (4.46KB)
    preprocess.py (8.26KB)
    main_new.py (5.16KB)
    process_data.py (7.10KB)
  attention is all you need/
    test.py (1.34KB)
    attention_keras.py (4.52KB)
    attention_tf.py (3.51KB)
    README.md (135B)

Comments

Disclaimer

The resources on 【只为小站】 come from user sharing and are provided for learning and research only. Please delete them within 24 hours of downloading; they must not be used for any other purpose, and users bear full responsibility for any consequences. Given the nature of the internet, 【只为小站】 cannot substantively verify the ownership, legality, compliance, authenticity, scientific validity, completeness, or effectiveness of works, information, or content transmitted by users; regardless of whether the site operator has reviewed such material, users assume all legal liability for any infringement or ownership disputes arising from the works, information, or content they transmit.
The resources on this site do not represent its views or positions; they are based on user sharing. In accordance with Article 22 of China's Regulations on the Protection of the Right to Network Dissemination of Information, if a resource involves infringement or related issues, please contact customer service at zhiweidada#qq.com (replace # with @); the site will provide full support and cooperation and handle the matter promptly. For more on copyright and disclaimers, see the Copyright and Disclaimer page.