face-detector source code

Uploader: 42144554 | Upload time: 2021-03-22 14:07:31 | File size: 1.46MB | File type: ZIP
Our work has been published in Pattern Recognition Letters. Paper title: Landmark Guidance Independent Spatio-channel Attention and Complementary Context Information based Facial Expression Recognition.

Our proposed FER framework:

Activation maps of FED-RO images generated with Grad-CAM are shown below:

Examples of the masked face images we synthesized from the FERPlus dataset are shown below:

A comparison of Grad-CAM activation maps on masked FERPlus face images between the baseline, RAN, and our model is shown below:

Note: our trained model checkpoints will be made available after publication.

Citation:

@article{gera2020landmark,
  title={Landmark Guidance Independent Spatio-channel Attention and Complementary Context Information based Facial Expression Recognition},
  author={Gera, Darshan and Balasubramanian, S},
  journal={arXiv preprint arXiv:2007.10298},
  year={2020}
}

Acknowledgements: We will
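The README repeatedly refers to Grad-CAM activation maps. As a rough illustration of what Grad-CAM computes (this is not the repository's code, just a minimal NumPy sketch): given a convolutional feature map and the gradient of the class score with respect to it, the heatmap is the ReLU of the gradient-weighted sum over channels.

```python
import numpy as np

def grad_cam(feature_maps: np.ndarray, gradients: np.ndarray) -> np.ndarray:
    """Minimal Grad-CAM heatmap (illustrative sketch only).

    feature_maps: (C, H, W) activations from a chosen conv layer.
    gradients:    (C, H, W) gradient of the class score w.r.t. those activations.
    Returns an (H, W) heatmap normalized to [0, 1].
    """
    # Channel importance weights: global-average-pool the gradients.
    weights = gradients.mean(axis=(1, 2))              # shape (C,)
    # Gradient-weighted sum of channels, then ReLU keeps positive evidence only.
    cam = np.tensordot(weights, feature_maps, axes=1)  # shape (H, W)
    cam = np.maximum(cam, 0.0)
    if cam.max() > 0:
        cam /= cam.max()                               # normalize for visualization
    return cam
```

In practice the feature maps and gradients would come from a forward/backward pass through the trained FER network; upsampling the heatmap to the input resolution and overlaying it on the face image yields figures like those referenced above.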


Resource details

(60 files, 1.46MB) face-detector source code
face-detector-main/
├── GACNN/
│   ├── sfew_dataset.py (8.14KB)
│   ├── main_fplus.py (13.74KB)
│   ├── main_sfew.py (13.74KB)
│   ├── main_affectnet.py (14.16KB)
│   ├── Readme.txt (844B)
│   ├── main_jaffe.py (9.63KB)
│   ├── ferplus_dataset.py (12.77KB)
│   ├── affectnet_dataset.py (11.96KB)
│   ├── logs/
│   │   ├── affectnet8_gacnn.txt (68.29KB)
│   │   ├── cross_evaluation_jaffe.txt (1.32KB)
│   │   ├── sfew_gacnn.txt (64.76KB)
│   │   └── fplus_gacnn.txt (65.02KB)
│   ├── jaffe_dataset.py (8.72KB)
│   └── model.py (5.59KB)
├── main_rafdb.py (17.24KB)
├── images/
│   ├── SCAN.png (11.91KB)
│   ├── graphicalabstract03.png (212.55KB)
│   ├── proposed_framework.png (1B)
│   ├── masked_activation_baseline_ran_ours.png (326.78KB)
│   ├── figure_masked_faces_07.png (128.05KB)
│   └── figure_grad_cam_06.png (595.94KB)
├── pretrainedmodels/
│   └── requirement.txt (124B)
├── dataset/
│   ├── sfew_dataset.py (2.58KB)
│   ├── sampler.py (2.60KB)
│   ├── oulucasia_dataset_cv.py (4.67KB)
│   ├── affectnet_rafdb_dataset.py (8.58KB)
│   ├── ckplus_dataset_cv.py (5.23KB)
│   ├── fedro_dataset.py (2.82KB)
│   ├── ferplus_dataset.py (9.20KB)
│   ├── affectnet_dataset.py (7.01KB)
│   └── rafdb_dataset.py (6.03KB)
├── models/
│   ├── attentionnet.py (12.21KB)
│   └── resnet.py (7.13KB)
├── main_ckplus.py (18.63KB)
├── utils/
│   └── util.py (1.42KB)
├── main_oulucasia.py (18.34KB)
├── README.md (1.62KB)
├── main_sfew.py (16.66KB)
├── main_affectnet.py (16.81KB)
├── Readme.txt (543B)
├── main_affectnet_rafdb_test_fedro.py (17.50KB)
├── main_ferplus.py (16.97KB)
└── OADN/
    ├── train_sfew.py (16.52KB)
    ├── train_ferplus.py (17.53KB)
    ├── dataset/
    │   ├── sfew_dataset_attentionmaps.py (12.82KB)
    │   ├── sampler.py (1.87KB)
    │   ├── affectnet_dataset_attentionmaps.py (17.99KB)
    │   ├── ferplus_dataset_attentionmaps.py (19.06KB)
    │   └── rafdb_dataset_attentionmaps.py (13.47KB)
    ├── train_affectnet.py (19.85KB)
    ├── models/
    │   ├── attentionnet.py (3.60KB)
    │   └── resnet.py (7.63KB)
    ├── util.py (2.35KB)
    ├── Readme.txt (1.01KB)
    ├── train_rafdb.py (17.80KB)
    └── logs/
        ├── sfew_oadn_affectnet.txt (23.67KB)
        ├── rafdb_occlusion_pose_results_oadn.txt (4.32KB)
        ├── affectnet8_pose_occlusion_validation_results.txt (7.31KB)
        ├── fplus_oadn.txt (41.03KB)
        └── affectnet8_oadn.txt (57.99KB)

