LIP-JPPNet-TensorFlow: a TensorFlow implementation of JPPNet for human parsing (source code)

Uploader: 42143161 | Upload time: 2021-02-19 20:25:43 | File size: 2.58MB | File type: ZIP
Disclaimer: this is a modified version of an existing repository; please refer to the original repository for more details.

Joint Body Parsing & Pose Estimation Network (JPPNet)

Xiaodan Liang, Ke Gong, Xiaohui Shen, and Liang Lin, "Look into Person: Joint Body Parsing & Pose Estimation Network and A New Benchmark", T-PAMI 2018.

Introduction

JPPNet is a state-of-the-art deep learning method for human parsing and pose estimation, built on top of TensorFlow. This joint human parsing and pose estimation network combines multi-scale feature connections and iterative location refinement in an end-to-end framework to achieve effective context modeling, so that the parsing and pose tasks benefit from each other. The unified framework achieves state-of-the-art performance on both human parsing and pose estimation. This distribution provides a publicly available implementation of the key model components reported in the paper accepted by T-PAMI 2018. By exploring a novel …
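The joint, iterative design described above can be illustrated with a minimal sketch. The code below is not the repository's model (the actual network lives in utils/model.py and LIP_model.py and is built on a ResNet-101 trunk under TensorFlow 1.x); it is a hypothetical tf.keras toy, with assumed output sizes (20 parsing labels and 16 pose keypoints, as used by the LIP benchmark) and invented layer names, that only shows the core idea: the parsing and pose branches share a backbone and, at every refinement stage, each branch consumes the concatenation of both branches' features before emitting per-stage parsing logits and pose heatmaps.

# Hypothetical sketch of the JPPNet idea, NOT the repository's code:
# two task branches share a backbone and refine each other over several
# stages by concatenating both tasks' feature maps.
import tensorflow as tf

NUM_PARSING_CLASSES = 20   # assumption: 19 LIP part labels + background
NUM_JOINTS = 16            # assumption: 16 pose keypoints as in LIP

def conv_block(x, filters, name):
    x = tf.keras.layers.Conv2D(filters, 3, padding="same", name=name + "_conv")(x)
    return tf.keras.layers.ReLU(name=name + "_relu")(x)

def build_jppnet_sketch(input_shape=(256, 256, 3), num_stages=2):
    inputs = tf.keras.Input(shape=input_shape)

    # Stand-in for the shared backbone (the paper uses a ResNet-101 trunk).
    x = conv_block(inputs, 64, "stem1")
    x = tf.keras.layers.MaxPool2D(4)(x)
    shared = conv_block(x, 128, "stem2")

    # Initial task-specific features.
    parse_feat = conv_block(shared, 128, "parse_init")
    pose_feat = conv_block(shared, 128, "pose_init")

    parsing_outputs, pose_outputs = [], []
    for s in range(num_stages):
        # Iterative refinement: each branch sees the other branch's features,
        # so parsing context helps pose estimation and vice versa.
        joint = tf.keras.layers.Concatenate(name="fuse_%d" % s)([parse_feat, pose_feat])
        parse_feat = conv_block(joint, 128, "parse_refine_%d" % s)
        pose_feat = conv_block(joint, 128, "pose_refine_%d" % s)

        parsing_outputs.append(
            tf.keras.layers.Conv2D(NUM_PARSING_CLASSES, 1,
                                   name="parse_logits_%d" % s)(parse_feat))
        pose_outputs.append(
            tf.keras.layers.Conv2D(NUM_JOINTS, 1,
                                   name="pose_heatmaps_%d" % s)(pose_feat))

    return tf.keras.Model(inputs, parsing_outputs + pose_outputs,
                          name="jppnet_sketch")

if __name__ == "__main__":
    build_jppnet_sketch().summary()

In the actual training scripts (train_JPPNet-s2.py, train_SS-JPPNet.py) the per-stage outputs are presumably supervised jointly, with a pixel-wise classification loss on the parsing maps and a regression loss on the pose heatmaps, which is what lets the two tasks regularize each other end to end.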

File download

Resource details

[{"title":"( 47 个子文件 2.58MB ) LIP-JPPNet-TensorFlow:TensorFlow中的JPPNet实施以供人解析-源码","children":[{"title":"LIP-JPPNet-TensorFlow-master","children":[{"title":"kaffe","children":[{"title":"errors.py <span style='color:#111;'> 111B </span>","children":null,"spread":false},{"title":"tensorflow","children":[{"title":"__init__.py <span style='color:#111;'> 76B </span>","children":null,"spread":false},{"title":"transformer.py <span style='color:#111;'> 10.16KB </span>","children":null,"spread":false},{"title":"network.py <span style='color:#111;'> 10.83KB </span>","children":null,"spread":false}],"spread":true},{"title":"caffe","children":[{"title":"resolver.py <span style='color:#111;'> 1.40KB </span>","children":null,"spread":false},{"title":"__init__.py <span style='color:#111;'> 54B </span>","children":null,"spread":false},{"title":"caffe_pb2.py <span style='color:#111;'> 253.16KB </span>","children":null,"spread":false}],"spread":true},{"title":"graph.py <span style='color:#111;'> 11.41KB </span>","children":null,"spread":false},{"title":"transformers.py <span style='color:#111;'> 10.66KB </span>","children":null,"spread":false},{"title":"__init__.py <span style='color:#111;'> 115B </span>","children":null,"spread":false},{"title":"layers.py <span style='color:#111;'> 4.81KB </span>","children":null,"spread":false},{"title":"shapes.py <span style='color:#111;'> 2.75KB </span>","children":null,"spread":false}],"spread":true},{"title":".gitignore <span style='color:#111;'> 1.23KB </span>","children":null,"spread":false},{"title":"evaluate_msc_with_CRF.py <span style='color:#111;'> 8.30KB </span>","children":null,"spread":false},{"title":"README.md <span style='color:#111;'> 4.13KB </span>","children":null,"spread":false},{"title":"utils","children":[{"title":"lip_reader.py <span style='color:#111;'> 11.22KB </span>","children":null,"spread":false},{"title":"image_reader_CRF.py <span style='color:#111;'> 6.11KB </span>","children":null,"spread":false},{"title":"ops.py <span style='color:#111;'> 2.20KB </span>","children":null,"spread":false},{"title":"read_LIP_data.py <span style='color:#111;'> 1.51KB </span>","children":null,"spread":false},{"title":"ssl_reader.py <span style='color:#111;'> 6.28KB </span>","children":null,"spread":false},{"title":"utils.py <span style='color:#111;'> 5.86KB </span>","children":null,"spread":false},{"title":"model.py <span style='color:#111;'> 25.74KB </span>","children":null,"spread":false},{"title":"pose_structure.py <span style='color:#111;'> 6.50KB </span>","children":null,"spread":false},{"title":"__init__.py <span style='color:#111;'> 255B </span>","children":null,"spread":false},{"title":"BatchDatsetReader.py <span style='color:#111;'> 3.77KB </span>","children":null,"spread":false},{"title":"image_reader.py <span style='color:#111;'> 6.04KB </span>","children":null,"spread":false}],"spread":false},{"title":"EvalMetrics.py <span style='color:#111;'> 15.87KB </span>","children":null,"spread":false},{"title":"LICENSE <span style='color:#111;'> 1.05KB </span>","children":null,"spread":false},{"title":"datasets","children":[{"title":"examples","children":[{"title":"list","children":[{"title":"val.txt <span style='color:#111;'> 84B </span>","children":null,"spread":false}],"spread":true},{"title":"images","children":[{"title":"76680_475011.jpg <span style='color:#111;'> 30.47KB </span>","children":null,"spread":false},{"title":"114317_456748.jpg <span style='color:#111;'> 9.16KB 
</span>","children":null,"spread":false}],"spread":true}],"spread":true},{"title":"lip","children":[{"title":"README.md <span style='color:#111;'> 402B </span>","children":null,"spread":false},{"title":"list","children":[{"title":"val_id.txt <span style='color:#111;'> 138.30KB </span>","children":null,"spread":false},{"title":"train_id.txt <span style='color:#111;'> 422.02KB </span>","children":null,"spread":false},{"title":"train_rev.txt <span style='color:#111;'> 2.40MB </span>","children":null,"spread":false}],"spread":true},{"title":"vis_annotation.py <span style='color:#111;'> 2.26KB </span>","children":null,"spread":false},{"title":"lip_train_set.csv <span style='color:#111;'> 5.23MB </span>","children":null,"spread":false},{"title":"lip_val_set.csv <span style='color:#111;'> 1.59MB </span>","children":null,"spread":false},{"title":"create_heatmaps.py <span style='color:#111;'> 1.85KB </span>","children":null,"spread":false}],"spread":true}],"spread":true},{"title":"bfscore.py <span style='color:#111;'> 6.38KB </span>","children":null,"spread":false},{"title":"denseCRF.py <span style='color:#111;'> 9.75KB </span>","children":null,"spread":false},{"title":"train_SS-JPPNet.py <span style='color:#111;'> 23.43KB </span>","children":null,"spread":false},{"title":"evaluate_parsing_JPPNet-s2.py <span style='color:#111;'> 7.87KB </span>","children":null,"spread":false},{"title":"train_JPPNet-s2.py <span style='color:#111;'> 23.43KB </span>","children":null,"spread":false},{"title":"LIP_model.py <span style='color:#111;'> 3.90KB </span>","children":null,"spread":false},{"title":"test_human.py <span style='color:#111;'> 3.18KB </span>","children":null,"spread":false},{"title":"evaluate_pose_JPPNet-s2.py <span style='color:#111;'> 7.29KB </span>","children":null,"spread":false}],"spread":false}],"spread":true}]

Comments

Disclaimer

The resources on 只为小站 come from user sharing and are provided for learning and research only. Please delete them within 24 hours of downloading; they must not be used for any other purpose, and you bear the consequences otherwise. Given the nature of the Internet, 只为小站 cannot substantively review the ownership, legality, compliance, authenticity, scientific validity, completeness, or effectiveness of works, information, or content transmitted by users; regardless of whether the site operator has reviewed them, users bear all legal liability for any infringement or ownership disputes that may arise from the works, information, or content they transmit.
The resources on this site do not represent its views or position; they are based on user sharing. In accordance with Article 22 of China's Regulations on the Protection of the Right to Network Dissemination of Information, if a resource is infringing or otherwise problematic, please contact the site's customer service at zhiweidada#qq.com (replace # with @); the site will offer its full support and cooperation and will respond and handle the matter promptly. For more on copyright and disclaimers, see the Copyright and Disclaimer statement.