GAST-Net-3DPoseEstimation: A Graph Attention Spatio-Temporal Convolutional Network (GAST-Net) for 3D Human Pose Estimation in Video

Uploader: 42112658 | Upload time: 2024-02-02 19:46:42 | File size: 39.9MB | File type: ZIP
A Graph Attention Spatio-Temporal Convolutional Network (GAST-Net) for 3D Human Pose Estimation in Video

News

- [2021/01/28] We updated GAST-Net so that it can generate human poses with 19 joints, including foot joints.
- [2020/11/17] We provided a tutorial on how to generate 3D poses/animation from a custom video.
- [2020/10/15] We achieved online 3D-skeleton-based action recognition with a single RGB camera.
- [2020/08/14] We achieved real-time 3D pose estimation.

Introduction

Spatio-temporal information is crucial for resolving occlusion and depth ambiguity in 3D pose estimation. Previous methods have focused either on temporal contexts or on local-to-global architectures that embed fixed-length spatio-temporal information. To date, no effective proposal has been made that both flexibly captures spatio-temporal sequences of varying length and achieves real-time 3D pose estimation. In this work, we model local and global spatial information via attention mechanisms, improving the learning of the kinematic constraints of the human skeleton: posture, local kinematic connections, and symmetry. To accommodate both single-frame and multi-frame estimation, a dilated…
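The local attention idea described above (restricting attention to a joint's kinematic neighbors on the skeleton graph) can be sketched in a few lines. This is a minimal NumPy illustration of masked graph attention, not the repository's actual `local_attention.py` implementation; all names, shapes, and the toy chain skeleton are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def local_graph_attention(features, adj, w_query, w_key):
    """Attend over each joint's skeleton neighbors only.

    features: (J, C) per-joint features; adj: (J, J) 0/1 connectivity
    (with self-loops); w_query, w_key: (C, D) projection matrices.
    """
    q = features @ w_query                         # (J, D) queries
    k = features @ w_key                           # (J, D) keys
    scores = (q @ k.T) / np.sqrt(q.shape[-1])      # pairwise scores
    scores = np.where(adj > 0, scores, -1e9)       # mask non-neighbors
    attn = softmax(scores, axis=-1)                # rows sum to 1
    return attn @ features                         # aggregate neighbors

rng = np.random.default_rng(0)
J, C, D = 5, 8, 4                                  # 5 joints, toy sizes
feats = rng.standard_normal((J, C))
# Chain skeleton with self-loops: joint i connects to i-1, i, i+1.
adj = np.eye(J) + np.diag(np.ones(J - 1), 1) + np.diag(np.ones(J - 1), -1)
wq = rng.standard_normal((C, D))
wk = rng.standard_normal((C, D))
out = local_graph_attention(feats, adj, wq, wk)    # (5, 8) refined features
```

A global-attention variant would simply drop the adjacency mask so every joint attends to all others, which is how local kinematic connections and whole-body context can be modeled by the same mechanism.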


Resource Details

( 103 files, 39.9MB ) GAST-Net-3DPoseEstimation: A Graph Attention Spatio-Temporal Convolutional Network (GAST-Net) for 3D Human Pose Estimation in Video

- apart.avi — 3.56MB
- yolov3.cfg — 8.15KB
- yolo-voc.cfg — 2.67KB
- yolo.cfg — 2.66KB
- tiny-yolo-voc.cfg — 1.38KB
- Baseball.gif — 9.49MB
- Baseball_body_foot.gif — 8.35MB
- WalkApart.gif — 1.81MB
- WalkTowards.gif — 1.58MB
- FallingDown.gif — 1.40MB
- .gitignore — 2.29KB
- GAST-Net.iml — 1.19KB
- baseball_wholebody.json — 1.04MB
- baseball.json — 160.07KB
- LICENSE — 1.13KB
- Makefile — 116B
- README.md — 6.70KB
- INFERENCE_EN.md — 2.62KB
- INFERENCE_CH.md — 2.43KB
- baseball.mp4 — 5.44MB
- baseball.mp4 — 3.96MB
- coco.names — 625B
- voc.names — 135B
- pallete — 908B
- pose_estimation.png — 1.24MB
- detection_tracking.png — 1.22MB
- input.png — 1.19MB
- framework.png — 335.15KB
- reconstruction.png — 173.54KB
- pose_hrnet.py — 17.71KB
- main.py — 14.76KB
- darknet.py — 13.87KB
- trainval.py — 13.66KB
- h36m_dataset.py — 12.34KB
- gen_kpts.py — 12.26KB
- gast_net.py — 11.14KB
- generators.py — 10.83KB
- reconstruction.py — 10.43KB
- vis_h36m.py — 9.59KB
- pose_resnet.py — 9.24KB
- prepare_data_humaneva.py — 9.12KB
- visualization.py — 8.28KB
- util.py — 7.97KB
- sort.py — 7.84KB
- gen_skes.py — 6.64KB
- preprocess.py — 6.46KB
- utils.py — 6.06KB
- arguments.py — 5.69KB
- utilitys.py — 5.63KB
- global_attention.py — 5.44KB
- prepare_data_h36m.py — 5.21KB
- human_detector.py — 5.07KB
- local_attention.py — 5.04KB
- sem_graph_conv.py — 4.85KB
- inference.py — 4.28KB
- humaneva_dataset.py — 4.23KB
- transforms.py — 3.74KB
- prepare_data_2d_h36m_sh.py — 3.44KB
- default.py — 3.44KB
- bbox.py — 3.27KB
- mpii_coco_h36m.py — 3.23KB
- loss.py — 2.84KB
- inference.py — 2.65KB
- prepare_data_2d_h36m_generic.py — 2.57KB
- skeleton.py — 2.50KB
- data_utils.py — 2.46KB
- coco_h36m.py — 2.18KB
- models.py — 2.07KB
- color_edge.py — 1.90KB
- camera.py — 1.79KB
- preprocess.py — 1.72KB
- vis_kpts.py — 1.58KB
- graph_utils.py — 1.56KB
- quaternion.py — 1.21KB
- _init_paths.py — 1.07KB
- mocap_dataset.py — 864B
- __init__.py — 541B
- __init__.py — 398B
- __init__.py — 369B
- __init__.py — 179B
- __init__.py — 0B
- __init__.py — 0B
- __init__.py — 0B
- workspace.xml — 48.94KB
- misc.xml — 298B
- modules.xml — 268B
- vcs.xml — 180B
- encodings.xml — 135B
- w32_384x288_adam_lr1e-3.yaml — 2.12KB
- w48_256x192_adam_lr1e-3.yaml — 2.12KB
- w32_256x192_adam_lr1e-3.yaml — 2.12KB
- w48_384x288_adam_lr1e-3.yaml — 2.12KB
- w48_256x256_adam_lr1e-3.yaml — 1.90KB
- w32_256x256_adam_lr1e-3.yaml — 1.90KB
- res152_256x256_d256x3_adam_lr1e-3.yaml — 1.49KB
- res101_256x256_d256x3_adam_lr1e-3.yaml — 1.49KB
- res50_256x256_d256x3_adam_lr1e-3.yaml — 1.49KB
- res101_384x288_d256x3_adam_lr1e-3.yaml — 1.43KB
- res101_256x192_d256x3_adam_lr1e-3.yaml — 1.43KB
- res152_384x288_d256x3_adam_lr1e-3.yaml — 1.43KB
- ……
- (Too many files; not all are shown.)

