xView2-Solution

Uploader: 42104181 | Upload time: 2023-03-07 21:36:22 | File size: 3.79MB | File type: ZIP
Third-place solution to the xView2 damage assessment challenge.
Eugene Khvedchenya, February 2020.

This repository contains the source code for the solution. It scored second (0.803) on the public leaderboard and third (0.805) on the private holdout dataset.

In brief:
- An ensemble of semantic segmentation models.
- Trained with weighted cross-entropy (CE) to counter class imbalance (see the first sketch after this summary).
- Heavy augmentation to prevent overfitting and to improve robustness to misaligned pre- and post-disaster images.
- Pre- and post-disaster images share the encoder; the extracted features are concatenated and passed to the decoder (see the second sketch below).
- A range of encoders (ResNet, DenseNet, EfficientNet) and two decoders: UNet and FPN.
- One round of pseudo-labeling.
- Ensembling by weighted averaging, with per-model weights optimized on the corresponding validation data (see the third sketch below).

Training: install the dependencies from requirements.txt and follow train.sh.

Inference: to run inference with the pretrained models, download the full archive from the Releases tab and run predi…
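To make the "weighted CE for class imbalance" point concrete, here is a minimal PyTorch sketch. The class weights below are hypothetical placeholders, not the values used by this repository (its actual loss setup lives in the training scripts).

```python
import torch
import torch.nn as nn

# Hypothetical per-class weights for the 5 xView2 classes
# (background, no-damage, minor, major, destroyed); rarer damage
# classes get larger weights so they contribute more to the loss.
class_weights = torch.tensor([0.1, 1.0, 3.0, 3.0, 3.0])
criterion = nn.CrossEntropyLoss(weight=class_weights)

# logits: [batch, classes, H, W]; target: [batch, H, W] of class indices
logits = torch.randn(2, 5, 256, 256)
target = torch.randint(0, 5, (2, 256, 256))
loss = criterion(logits, target)
print(loss.item())
```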
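The shared-encoder design can be sketched as follows: both images pass through the same encoder, and their deepest feature maps are concatenated before decoding. This is a simplified stand-in (torchvision ResNet-34 as the encoder, a toy convolutional head instead of the repository's UNet/FPN decoders), not the actual model code.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet34

class SharedEncoderSegModel(nn.Module):
    """Pre- and post-disaster images go through the same encoder;
    their deepest features are concatenated and decoded jointly."""

    def __init__(self, num_classes: int = 5):
        super().__init__()
        backbone = resnet34(weights=None)
        self.stem = nn.Sequential(backbone.conv1, backbone.bn1, backbone.relu, backbone.maxpool)
        self.stages = nn.Sequential(backbone.layer1, backbone.layer2, backbone.layer3, backbone.layer4)
        # Toy decoder head on the concatenated features (512 + 512 channels).
        self.head = nn.Sequential(
            nn.Conv2d(1024, 256, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(256, num_classes, kernel_size=1),
        )

    def forward(self, pre, post):
        f_pre = self.stages(self.stem(pre))    # shared encoder, pre image
        f_post = self.stages(self.stem(post))  # shared encoder, post image
        fused = torch.cat([f_pre, f_post], dim=1)  # channel-wise concatenation
        logits = self.head(fused)
        # Upsample back to the input resolution.
        return nn.functional.interpolate(
            logits, size=pre.shape[-2:], mode="bilinear", align_corners=False
        )

model = SharedEncoderSegModel()
pre = torch.randn(1, 3, 256, 256)
post = torch.randn(1, 3, 256, 256)
print(model(pre, post).shape)  # torch.Size([1, 5, 256, 256])
```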
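The weighted-averaging ensembling step can likewise be sketched in a few lines: per-model softmax maps are blended with per-model weights (the listing's optimize_softmax.py and averaging_rounder.py appear related), and the blended map is argmaxed into a label map. Function and variable names here are illustrative, not the repository's API.

```python
import numpy as np

def blend_predictions(prob_maps, weights):
    """Weighted average of per-model softmax maps.

    prob_maps: list of arrays shaped [num_classes, H, W], one per model.
    weights:   per-model blending weights (e.g. tuned on validation data).
    """
    w = np.asarray(weights, dtype=np.float64)
    w = w / w.sum()                                 # normalize weights
    stacked = np.stack(prob_maps, axis=0)           # [num_models, C, H, W]
    blended = np.tensordot(w, stacked, axes=1)      # weighted sum -> [C, H, W]
    return blended.argmax(axis=0)                   # final label map [H, W]

# Toy usage with three hypothetical models
preds = [np.random.rand(5, 64, 64) for _ in range(3)]
label_map = blend_predictions(preds, weights=[0.5, 0.3, 0.2])
print(label_map.shape)  # (64, 64)
```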

Resource Details

[{"title":"( 70 个子文件 3.79MB ) xView2-Solution","children":[{"title":"xView2-Solution-master","children":[{"title":"run_tensorboard_3389.sh <span style='color:#111;'> 101B </span>","children":null,"spread":false},{"title":"predict_oof.py <span style='color:#111;'> 3.84KB </span>","children":null,"spread":false},{"title":"evaluate_postprocessing.py <span style='color:#111;'> 3.54KB </span>","children":null,"spread":false},{"title":"test_docker_pytorch14_37.cmd <span style='color:#111;'> 1.63KB </span>","children":null,"spread":false},{"title":"train_folds.csv <span style='color:#111;'> 3.51MB </span>","children":null,"spread":false},{"title":"fit_predict.py <span style='color:#111;'> 24.22KB </span>","children":null,"spread":false},{"title":"convert_masks.py <span style='color:#111;'> 2.66KB </span>","children":null,"spread":false},{"title":"predict_37_weighted.py <span style='color:#111;'> 9.01KB </span>","children":null,"spread":false},{"title":"optimize_softmax.py <span style='color:#111;'> 2.82KB </span>","children":null,"spread":false},{"title":"train.sh <span style='color:#111;'> 11.49KB </span>","children":null,"spread":false},{"title":"predict.py <span style='color:#111;'> 4.54KB </span>","children":null,"spread":false},{"title":"finetune.py <span style='color:#111;'> 20.28KB </span>","children":null,"spread":false},{"title":"tests","children":[{"title":"test_damage_00121_prediction.png <span style='color:#111;'> 9.96KB </span>","children":null,"spread":false},{"title":"test_localization_00121_prediction.png <span style='color:#111;'> 12.29KB </span>","children":null,"spread":false},{"title":"test_post_00121.png <span style='color:#111;'> 1.55MB </span>","children":null,"spread":false},{"title":"pre.png <span style='color:#111;'> 1.97KB </span>","children":null,"spread":false},{"title":"test_postprocessing.py <span style='color:#111;'> 1.49KB </span>","children":null,"spread":false},{"title":"hurricane-florence_00000475_post_disaster.png <span style='color:#111;'> 7.70KB </span>","children":null,"spread":false},{"title":"test_dataset.py <span style='color:#111;'> 3.14KB </span>","children":null,"spread":false},{"title":"guatemala-volcano_00000000_post_disaster.png <span style='color:#111;'> 1.52MB </span>","children":null,"spread":false},{"title":"hurricane-florence_00000115_post_disaster.png <span style='color:#111;'> 9.72KB </span>","children":null,"spread":false},{"title":"post.png <span style='color:#111;'> 2.04KB </span>","children":null,"spread":false},{"title":"test_load_mask.py <span style='color:#111;'> 1.22KB </span>","children":null,"spread":false},{"title":"test_registration.py <span style='color:#111;'> 2.58KB </span>","children":null,"spread":false},{"title":"test_models.py <span style='color:#111;'> 3.70KB </span>","children":null,"spread":false}],"spread":false},{"title":"docker_submission_37.py <span style='color:#111;'> 9.91KB </span>","children":null,"spread":false},{"title":"convert_crops.py <span style='color:#111;'> 3.95KB </span>","children":null,"spread":false},{"title":"LICENSE <span style='color:#111;'> 1.03KB </span>","children":null,"spread":false},{"title":"black.toml <span style='color:#111;'> 524B </span>","children":null,"spread":false},{"title":"run_tensorboard.sh <span style='color:#111;'> 101B </span>","children":null,"spread":false},{"title":"build_push_docker_37.cmd <span style='color:#111;'> 244B </span>","children":null,"spread":false},{"title":"train.csv <span style='color:#111;'> 3.57MB 
</span>","children":null,"spread":false},{"title":"requirements.txt <span style='color:#111;'> 179B </span>","children":null,"spread":false},{"title":".gitignore <span style='color:#111;'> 1.22KB </span>","children":null,"spread":false},{"title":"make_folds.py <span style='color:#111;'> 1.66KB </span>","children":null,"spread":false},{"title":"requirements_docker_pytorch14.txt <span style='color:#111;'> 387B </span>","children":null,"spread":false},{"title":".dockerignore <span style='color:#111;'> 106B </span>","children":null,"spread":false},{"title":"README.md <span style='color:#111;'> 1.06KB </span>","children":null,"spread":false},{"title":"xview","children":[{"title":"models","children":[{"title":"common.py <span style='color:#111;'> 824B </span>","children":null,"spread":false},{"title":"fpn_v2.py <span style='color:#111;'> 7.26KB </span>","children":null,"spread":false},{"title":"unetv2.py <span style='color:#111;'> 10.45KB </span>","children":null,"spread":false},{"title":"__init__.py <span style='color:#111;'> 1.28KB </span>","children":null,"spread":false},{"title":"cls.py <span style='color:#111;'> 1.79KB </span>","children":null,"spread":false}],"spread":false},{"title":"pseudo.py <span style='color:#111;'> 3.18KB </span>","children":null,"spread":false},{"title":"alignment.py <span style='color:#111;'> 2.82KB </span>","children":null,"spread":false},{"title":"ssim_loss.py <span style='color:#111;'> 2.05KB </span>","children":null,"spread":false},{"title":"train_utils.py <span style='color:#111;'> 2.38KB </span>","children":null,"spread":false},{"title":"optim.py <span style='color:#111;'> 6.93KB </span>","children":null,"spread":false},{"title":"augmentations.py <span style='color:#111;'> 6.68KB </span>","children":null,"spread":false},{"title":"inference.py <span style='color:#111;'> 17.43KB </span>","children":null,"spread":false},{"title":"metric.py <span style='color:#111;'> 8.46KB </span>","children":null,"spread":false},{"title":"dataset.py <span style='color:#111;'> 21.00KB </span>","children":null,"spread":false},{"title":"model_wrapper.py <span style='color:#111;'> 3.44KB </span>","children":null,"spread":false},{"title":"losses.py <span style='color:#111;'> 8.23KB </span>","children":null,"spread":false},{"title":"xview2_metrics.py <span style='color:#111;'> 11.09KB </span>","children":null,"spread":false},{"title":"factory.py <span style='color:#111;'> 4.61KB </span>","children":null,"spread":false},{"title":"postprocessing.py <span style='color:#111;'> 8.49KB </span>","children":null,"spread":false},{"title":"scheduler.py <span style='color:#111;'> 1.67KB </span>","children":null,"spread":false},{"title":"visualization.py <span style='color:#111;'> 4.71KB </span>","children":null,"spread":false},{"title":"rounder.py <span style='color:#111;'> 2.88KB </span>","children":null,"spread":false},{"title":"utils","children":[{"title":"combine_jsons.py <span style='color:#111;'> 4.84KB </span>","children":null,"spread":false},{"title":"inference.sh <span style='color:#111;'> 7.20KB </span>","children":null,"spread":false},{"title":"data_finalize.sh <span style='color:#111;'> 4.67KB </span>","children":null,"spread":false},{"title":"split_into_disasters.py <span style='color:#111;'> 4.88KB </span>","children":null,"spread":false},{"title":"view_polygons.ipynb <span style='color:#111;'> 12.37KB </span>","children":null,"spread":false},{"title":"inference_image_output.py <span style='color:#111;'> 7.73KB </span>","children":null,"spread":false},{"title":"mask_polygons.py 
<span style='color:#111;'> 11.19KB </span>","children":null,"spread":false}],"spread":false},{"title":"averaging_rounder.py <span style='color:#111;'> 3.19KB </span>","children":null,"spread":false}],"spread":false},{"title":"Dockerfile-pytorch14-37 <span style='color:#111;'> 572B </span>","children":null,"spread":false},{"title":"run_tensorboard.cmd <span style='color:#111;'> 121B </span>","children":null,"spread":false}],"spread":false}],"spread":true}]
