易觅 (Eame General): a powerful writing tool, essential for both Chinese and English academic paper writing.
2019-12-21 20:31:10 311KB Eame General abstract paper
Complete LCSTS dataset. Due to the file-size limit, the archive contains a single txt file with the download link.
2019-12-21 20:16:48 172B LCSTS text-summarization automatic-summarization dataset
Study notes on the paper "Methods for interpreting and understanding deep neural networks".
2019-12-21 19:57:34 3.58MB CNN-visualization
Foreign-language literature and its translation, for use in a graduation project.
2019-12-21 19:52:59 69KB graduation-project
An introduction to automatic summarization by Dr. Wan Xiaojun (万小军), an expert in text summarization.
2019-12-21 19:34:46 871KB automatic-summarization introduction
Computing a file digest with the SHA-1 or MD5 algorithm (Java).
2019-12-21 19:29:06 2KB SHA1 MD5 file-digest Java
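The resource above is a Java implementation; as a rough illustration of the same idea, a minimal Python sketch that computes a file's SHA-1 or MD5 digest by hashing it in chunks might look like this (the file name is a placeholder):

    import hashlib

    def file_digest(path, algorithm="sha1", chunk_size=8192):
        # algorithm can be "sha1" or "md5"; read the file in chunks so large
        # files do not have to fit in memory.
        h = hashlib.new(algorithm)
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()

    print(file_digest("example.bin", "md5"))  # "example.bin" is a placeholder path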
This archive contains SHA-3 Python source code along with the official documentation. The hash output matches the standard values, the comments are detailed, and the code is suitable for beginners. It implements the SHA-3 512-bit hash algorithm. The empty-string hash value is listed here for verification ('a69f73cca23a9ac5c8b567dc185a756e97c982164fe25859e0d1dcc1475c80a615b2123af1f5f94c11e3e9402c3ac558f500199d95b6d3e301758586281dcd26'); it can be checked against Python 3's built-in hashlib module.
2019-12-21 19:29:05 2.4MB SHA-3 Keccak source-code
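As the description suggests, the quoted empty-string value can be checked against Python 3's built-in hashlib; a minimal check:

    import hashlib

    # Reference SHA3-512 digest of the empty string, as quoted in the description above.
    expected = ("a69f73cca23a9ac5c8b567dc185a756e97c982164fe25859e0d1dcc1475c80a6"
                "15b2123af1f5f94c11e3e9402c3ac558f500199d95b6d3e301758586281dcd26")
    assert hashlib.sha3_512(b"").hexdigest() == expected
    print("SHA3-512 of the empty string matches the reference value.")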
Complete Python crawler code for Baidu news search: given any keyword, it crawls the related news articles, computes word-frequency statistics, and automatically generates news summaries after word segmentation. Includes the full crawler, summary-generation, and word-segmentation code, plus usage notes.
2019-12-21 19:27:24 73KB Python-crawler web-crawler Baidu-news NLP
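The resource's own code is not shown here, but a minimal sketch of the frequency-based extractive step it describes, assuming jieba for Chinese word segmentation (the function name and sentence count are illustrative), might look like this:

    import re
    from collections import Counter

    import jieba  # Chinese word segmentation

    def summarize(text, num_sentences=3):
        # Split on common Chinese/ASCII sentence-ending punctuation (simplified rule).
        sentences = [s.strip() for s in re.split(r"[。!?!?]", text) if s.strip()]
        # Score each sentence by the text-wide frequency of its segmented words.
        freq = Counter(w for w in jieba.lcut(text) if len(w) > 1)
        def score(sent):
            return sum(freq[w] for w in jieba.lcut(sent))
        # Keep the top-scoring sentences, restored to their original order.
        top = sorted(sentences, key=score, reverse=True)[:num_sentences]
        top.sort(key=sentences.index)
        return "。".join(top)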
Document summarization methods, from the most basic statistical approaches to cutting-edge deep learning and reinforcement learning techniques, plus performance-optimization strategies. (Open-source code included.)
2019-12-21 19:27:04 4.09MB document-summarization performance-optimization headline-generation deep-learning
The document contains a cloud-drive link; the data totals 319MB. Suitable for NLP work on text summarization, text classification, and related tasks. The LCSTS dataset includes two parts:
/DATA:
1. PART I: the main content of LCSTS, containing 2,400,591 (short text, summary) pairs. It can be used to train supervised models for summary generation.
2. PART II: contains 10,666 human-labeled (short text, summary) pairs, which can be used to train a classifier to filter the noise in PART I.
3. PART III: contains 1,106 (short text, summary) pairs, each labeled by 3 annotators with the same labels. Pairs with scores 3, 4, and 5 can be used as a test set for evaluating summary-generation systems.
/Result:
1. sumary.generated.char.context.txt: summaries generated by RNN+context on character-based input.
2. sumary.generated.char.nocontext.txt: summaries generated by RNN+nocontext on character-based input.
3. sumary.generated.word.context.txt: summaries generated by RNN+context on word-based input.
4. sumary.generated.word.nocontext.txt: summaries generated by RNN+nocontext on word-based input.
5. weibo.txt: the weibo posts of the test set.
6. sumary.human: the human-written summaries corresponding to weibo.txt; this part is the test set of the paper.
7. rouge.char_context.txt: the ROUGE metrics for sumary.generated.char.context.
8. rouge.char_nocontext.txt: the ROUGE metrics for sumary.generated.char.nocontext.
9. rouge.word_context.txt: the ROUGE metrics for sumary.generated.word.context.
10. rouge.word_nocontext.txt: the ROUGE metrics for sumary.generated.word.nocontext.
2019-12-21 19:26:22 66B nlp
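As the description notes, PART III pairs scored 3, 4, or 5 serve as the test set. A hedged sketch of that filtering step, assuming the released <doc>/<human_label>/<summary>/<short_text> layout (adjust the patterns if your copy of the file differs):

    import re

    def load_test_pairs(path, min_score=3):
        # Extract (short_text, summary) pairs whose human label is >= min_score.
        with open(path, encoding="utf-8") as f:
            raw = f.read()
        pairs = []
        for doc in re.findall(r"<doc[^>]*>(.*?)</doc>", raw, re.S):
            score = int(re.search(r"<human_label>\s*(\d+)", doc).group(1))
            summary = re.search(r"<summary>\s*(.*?)\s*</summary>", doc, re.S).group(1)
            short_text = re.search(r"<short_text>\s*(.*?)\s*</short_text>", doc, re.S).group(1)
            if score >= min_score:
                pairs.append((short_text.strip(), summary.strip()))
        return pairs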