self-attention-cv: Implementations of various self-attention mechanisms focused on computer vision. Work-in-progress repository - source code

Uploader: 42116805 | Upload time: 2021-08-28 09:58:39 | File size: 132KB | File type: ZIP
Self-attention building blocks for computer vision applications in PyTorch. Implements self-attention mechanisms for computer vision in PyTorch using einsum and einops, with a focus on computer-vision self-attention modules.

Install via pip:

```
$ pip install self-attention-cv
```

If you don't have a GPU, it is best to pre-install PyTorch in your environment.

Code example - multi-head self-attention:

```python
import torch
from self_attention_cv import MultiHeadSelfAttention

model = MultiHeadSelfAttention(dim=64)
x = torch.rand(16, 10, 64)   # [batch, tokens, dim]
mask = torch.zeros(10, 10)   # tokens x tokens
# The source is truncated after "mask[5:"; a plausible completion is to
# mark a sub-block of token pairs as attendable before the forward pass:
mask[5:8, 5:8] = 1
y = model(x, mask)
```
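To make the mechanism behind `MultiHeadSelfAttention` concrete, here is a minimal single-head sketch of masked scaled dot-product self-attention in plain PyTorch, in the `torch.einsum` style the library advertises. The function name and the `w_qkv` projection matrix are illustrative, not part of the library's API:

```python
import torch

def scaled_dot_product_attention(x, w_qkv, mask=None):
    # x: [batch, tokens, dim]; w_qkv: [dim, 3*dim] projects x to queries, keys, values
    q, k, v = (x @ w_qkv).chunk(3, dim=-1)
    dim = q.shape[-1]
    # Pairwise similarity of every query token i with every key token j
    scores = torch.einsum('b i d, b j d -> b i j', q, k) / dim ** 0.5
    if mask is not None:
        # Positions where mask == 0 are excluded from attention
        scores = scores.masked_fill(mask == 0, float('-inf'))
    attn = scores.softmax(dim=-1)          # attention weights, rows sum to 1
    return torch.einsum('b i j, b j d -> b i d', attn, v)

x = torch.rand(16, 10, 64)                 # [batch, tokens, dim]
w_qkv = torch.rand(64, 3 * 64)
mask = torch.ones(10, 10)                  # all-ones mask: no positions hidden
out = scaled_dot_product_attention(x, w_qkv, mask)
print(out.shape)                           # torch.Size([16, 10, 64])
```

The output keeps the input shape `[batch, tokens, dim]`: each token's new representation is a weighted average of the value vectors, with weights given by the softmaxed query-key scores.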


Resource details

(71 files, 132KB) self-attention-cv-main:

self-attention-cv-main/
├── .github/
│   ├── FUNDING.yml (679B)
│   └── workflows/
│       └── publish.yml (618B)
├── requirements.txt (64B)
├── examples/
│   ├── vanilla_transformer.py (312B)
│   ├── pos_emb_1d.py (597B)
│   ├── vit_minimal.py (431B)
│   ├── seg3d.py (418B)
│   ├── TransUnet.py (247B)
│   ├── rel_pos_emb2D.py (380B)
│   ├── mhsa.py (311B)
│   ├── axial_att_block.py (507B)
│   ├── vanilla_transformer_block.py (295B)
│   ├── bot_attention.py (550B)
│   ├── MSA.py (399B)
│   ├── timesformer.py (670B)
│   ├── linformer.py (316B)
│   ├── scaled_dot_prod_att.py (209B)
│   └── resnet50vit_minimal.py (300B)
├── LICENSE (1.04KB)
├── feat_img.png (80.88KB)
├── setup.py (2.81KB)
├── README.md (5.69KB)
├── self_attention_cv/
│   ├── MSA_transformer/
│   │   ├── tied_axial_attention.py (2.57KB)
│   │   ├── MSA_transformer_block.py (2.66KB)
│   │   └── __init__.py (101B)
│   ├── axial_attention_deeplab/
│   │   ├── axial_attention.py (3.66KB)
│   │   ├── __init__.py (108B)
│   │   └── axial_attention_residual_block.py (2.41KB)
│   ├── transunet/
│   │   ├── __init__.py (34B)
│   │   ├── trans_unet.py (3.11KB)
│   │   ├── bottleneck_layer.py (2.19KB)
│   │   └── decoder.py (1.59KB)
│   ├── pos_embeddings/
│   │   ├── relative_embeddings_2D.py (1.92KB)
│   │   ├── __init__.py (205B)
│   │   ├── relative_embeddings_1D.py (2.39KB)
│   │   ├── abs_pos_emb1D.py (743B)
│   │   ├── relative_pos_enc_qkv.py (1.81KB)
│   │   └── pos_encoding_sin.py (940B)
│   ├── Transformer3Dsegmentation/
│   │   ├── tranf3Dseg.py (3.82KB)
│   │   └── __init__.py (41B)
│   ├── linformer/
│   │   ├── __init__.py (81B)
│   │   ├── LinformerBlock.py (3.47KB)
│   │   └── linformer.py (2.45KB)
│   ├── __init__.py (378B)
│   ├── transformer_vanilla/
│   │   ├── self_attention.py (1.41KB)
│   │   ├── mhsa.py (2.04KB)
│   │   ├── __init__.py (151B)
│   │   └── transformer_block.py (1.83KB)
│   ├── timesformer/
│   │   ├── spacetime_attention.py (4.70KB)
│   │   ├── __init__.py (84B)
│   │   └── timesformer.py (5.77KB)
│   ├── vit/
│   │   ├── R50_ViT.py (2.25KB)
│   │   ├── __init__.py (54B)
│   │   └── vit.py (3.63KB)
│   ├── bottleneck_transformer/
│   │   ├── bot_block.py (4.80KB)
│   │   ├── __init__.py (98B)
│   │   └── bot_att.py (2.37KB)
│   ├── version.py (165B)
│   └── common.py (526B)
├── tests/
│   ├── test_resnet50vit_minimal.py (790B)
│   ├── test_axial_att_block.py (309B)
│   ├── test_mhsa.py (375B)
│   ├── test_MSA.py (460B)
│   ├── test_timesformer.py (675B)
│   ├── test_bot_attention.py (607B)
│   ├── test_vanilla_transformer_block.py (656B)
│   ├── test_linformer.py (367B)
│   ├── test_TransUnet.py (285B)
│   └── test_pos_emb.py (783B)
├── .gitignore (175B)
└── pyproject.toml (103B)

