Collection of Prompt-related Papers

Uploader: 43902773 | Upload time: 2023-04-11 16:58:05 | File size: 81.9MB | File type: ZIP
A collection of recent papers on prompt tuning, around 70 in total (65 PDF files in the archive).


Resource Details

Collection of Prompt-related Papers ( 65 files, 81.9MB )

PromptPapers/
  Overview/
    Pre-train, Prompt, and Predict A Systematic Survey of.pdf (11.76MB)
    OpenPrompt An Open-source Framework for Prompt-learning.pdf (296.16KB)
    Pre-Trained Models Past, Present and Future.pdf (2.18MB)
    Paradigm Shift in Natural Language Processing.pdf (843.59KB)
  Improvements/
    Revisiting Self-Training for Few-Shot Learning of Language Model.pdf (561.45KB)
    Knowledgeable Prompt-tuning Incorporating Knowledge into Prompt Verbalizer for Text Classification.pdf (440.93KB)
    Text Generation with Efficient (Soft) Q-Learning.pdf (3.51MB)
    Adapting Language Models for Zero-shot Learning by Meta-tuning on.pdf (1.26MB)
    Noisy Channel Language Model Prompting.pdf (1.46MB)
    Calibrate Before Use Improving Few-Shot Performance of Language Models.pdf (679.36KB)
  P-Tuning v2 Prompt Tuning Can Be Comparable to Fine-tuning Universally Across Scales and Tasks.pdf (712.60KB)
  Pilot Work/
    Exploring the Limits of Transfer Learning with a Unified.pdf (1.11MB)
    Language Models are Few-Shot Learners.pdf (6.45MB)
    Parameter-Efficient Transfer Learning for NLP.pdf (707.47KB)
    How Can We Know What Language Models Know.pdf (465.76KB)
    Language Models as Knowledge Bases.pdf (335.19KB)
  Analysis/
    What Makes Good In-Context Examples for GPT-3.pdf (400.99KB)
    Cross-Task Generalization via Natural Language Crowdsourcing Instructions.pdf (1.70MB)
    Surface Form Competition Why the Highest Probability Answer Isn’t Always Right.pdf (4.45MB)
    Do Prompt-Based Models Really Understand.pdf (653.07KB)
    True Few-Shot Learning with Language Models.pdf (902.19KB)
    Fantastically Ordered Prompts and Where to Find Them.pdf (513.33KB)
    Exploring Low-dimensional Intrinsic Task Subspace via Prompt Tuning.pdf (1.28MB)
    Adapting Language Models for Zero-shot Learning by Meta-tuning on.pdf (1.26MB)
    TOWARDS A UNIFIED VIEW OF PARAMETER-EFFICIENT TRANSFER LEARNING.pdf (1.80MB)
    How Many Data Points is a Prompt Worth.pdf (1.05MB)
    Avoiding Inference Heuristics in Few-shot Prompt-based Finetuning.pdf (621.70KB)
    Why Do Pretrained Language Models Help in Downstream.pdf (1.01MB)
  Specializations/
    GPT3Mix Leveraging Large-scale Language Models for Text Augmentation.pdf (509.97KB)
    Exploring Prompt-based Few-shot Learning for Grounded Dialog Generation.pdf (12.20MB)
    CONTROL PREFIXES for Text Generation.pdf (925.86KB)
    Template-free Prompt Tuning for Few-shot NER.pdf (413.76KB)
    PADA A Prompt-based Autoregressive Approach for Adaptation to Unseen Domains.pdf (499.60KB)
    LEARNING TO PROMPT FOR VISION-LANGUAGE MODELS.pdf (2.14MB)
    Few-Shot Bot Prompt-Based Learning for Dialogue Systems.pdf (7.16MB)
    A Good Prompt Is Worth Millions of Parameters.pdf (930.28KB)
    Prompt-Learning for Fine-Grained Entity Typing.pdf (1.48MB)
    MSP Multi-Stage Prompting for Making Pre-trained Language Models Better Translators.pdf (438.07KB)
    KnowPrompt Knowledge-aware Prompt-tuning with Synergistic Optimization for Relation Extraction.pdf (3.51MB)
    Label Verbalization and Entailment for Effective Zero- and Few-Shot Relation Extraction.pdf (628.48KB)
    SentiPrompt Sentiment Knowledge Enhanced Prompt-Tuning for Aspect-Based Sentiment Analysis.pdf (837.29KB)
    The Power of Prompt Tuning for Low-Resource Semantic Parsing.pdf (742.91KB)
    Thinking Aloud Dynamic Context Generation Improves Zero-Shot.pdf (961.13KB)
    Constrained Language Models Yield Few-Shot Semantic Parsers.pdf (799.79KB)
    CPT COLORFUL PROMPT TUNING FOR PRE-TRAINED VISION-LANGUAGE MODELS.pdf (999.66KB)
  Basics/
    Cutting Down on Prompts and Parameters.pdf (641.48KB)
    The Power of Scale for Parameter-Efficient Prompt Tuning.pdf (534.93KB)
    Improving and Simplifying Pattern Exploiting Training.pdf (517.80KB)
    Exploiting Cloze Questions for Few Shot Text Classification and Natural.pdf (474.84KB)
    Factual Probing Is [MASK] Learning vs. Learning to Recall.pdf (908.10KB)
    NSP-BERT A Prompt-based Zero-Shot Learner.pdf (659.16KB)
    AUTOPROMPT Eliciting Knowledge from Language Models.pdf (647.64KB)
    DIFFERENTIABLE PROMPT MAKES PRE-TRAINED.pdf (1.04MB)
    MULTITASK PROMPTED TRAINING ENABLES.pdf (3.11MB)
    FINETUNED LANGUAGE MODELS.pdf (1.19MB)
    Prompt Programming for Large Language Models.pdf (182.24KB)
    PPT Pre-trained Prompt Tuning for Few-shot Learning.pdf (519.88KB)
    Prefix-Tuning Optimizing Continuous Prompts for Generation.pdf (1.50MB)
    Making Pre-trained Language Models Better Few-shot Learners.pdf (1.37MB)
    GPT Understands, Too.pdf (1.51MB)
    It’s Not Just Size That Matters.pdf (457.39KB)
    WARP Word-level Adversarial ReProgramming.pdf (1.59MB)
    PTR Prompt Tuning with Rules for Text Classification.pdf (785.12KB)
    Learning How to Ask Querying LMs with Mixtures of Soft Prompts.pdf (410.03KB)
    Automatically Identifying Words That Can Serve as Labels for Few-Shot.pdf (268.63KB)


Disclaimer

The resources on 【只为小站】 are shared by users and are intended for study and research only. You must delete them within 24 hours of downloading and may not use them for any other purpose; you bear all consequences otherwise. Given the nature of the Internet, 【只为小站】 cannot substantively review the ownership, legality, compliance, authenticity, scientific validity, completeness, or effectiveness of works, information, or content transmitted by users; regardless of whether the site operator has reviewed them, users alone bear any legal liability for infringement or ownership disputes that may arise, or have arisen, from the works, information, or content they transmit.
The resources on this site do not represent the site's views or positions; they are based on user sharing. Under Article 22 of China's Regulations on the Protection of the Right of Dissemination via Information Networks, if a resource infringes rights or raises related issues, please contact site support at zhiweidada#qq.com (replace # with @); the site will cooperate fully and handle reports promptly. For more on copyright and disclaimers, see the Copyright and Disclaimer page.