Monocular Depth Estimation from a Single Image - Python Development

Uploader: 42114041 | Uploaded: 2022-05-01 16:40:44 | File size: 9.94MB | File type: ZIP
Monodepth2 is the reference PyTorch implementation for training and testing depth estimation models, using the method described in "Digging Into Self-Supervised Monocular Depth Prediction" by Clément Godard, Oisin Mac Aodha, Michael Firman, and Gabriel J. Brostow, ICCV 2019. See the license file for terms. If you find our work useful in your research, please consider citing our paper: @article{monodepth2, title = {
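Per the bundled README, a pretrained model can be run on one image with python test_simple.py --image_path assets/test_image.jpg --model_name mono+stereo_640x192. Below is a minimal sketch of the inference path that script follows; the weight paths and the mono+stereo_640x192 model choice are illustrative assumptions, and the pretrained weights themselves are a separate download, not part of this archive.

import torch
from PIL import Image
from torchvision import transforms

import networks  # the networks/ package in this archive

# Illustrative paths -- pretrained weights are NOT included in this 9.94MB ZIP.
encoder_path = "models/encoder.pth"
decoder_path = "models/depth.pth"

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# ResNet-18 encoder plus the matching depth decoder from networks/.
encoder = networks.ResnetEncoder(18, False)
checkpoint = torch.load(encoder_path, map_location=device)
# The encoder checkpoint also records the image size used during training.
feed_height, feed_width = checkpoint["height"], checkpoint["width"]
encoder.load_state_dict(
    {k: v for k, v in checkpoint.items() if k in encoder.state_dict()})

decoder = networks.DepthDecoder(num_ch_enc=encoder.num_ch_enc, scales=range(4))
decoder.load_state_dict(torch.load(decoder_path, map_location=device))
encoder.to(device).eval()
decoder.to(device).eval()

# Resize the input to the network's feed size and predict disparity.
image = Image.open("assets/test_image.jpg").convert("RGB")
image = image.resize((feed_width, feed_height), Image.LANCZOS)
batch = transforms.ToTensor()(image).unsqueeze(0).to(device)

with torch.no_grad():
    outputs = decoder(encoder(batch))
    disp = outputs[("disp", 0)]  # finest-scale sigmoid disparity map

The ("disp", 0) tensor is a normalized inverse-depth map at the feed resolution; upsampled back to the original image size and colormapped, it gives the kind of visualization shown in assets/teaser.gif.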

Resource Details

[{"title":"( 46 个子文件 9.94MB ) 从单个图像进行单眼深度估计-Python开发","children":[{"title":"monodepth2-master","children":[{"title":"trainer.py <span style='color:#111;'> 25.23KB </span>","children":null,"spread":false},{"title":".gitignore <span style='color:#111;'> 64B </span>","children":null,"spread":false},{"title":"datasets","children":[{"title":"__init__.py <span style='color:#111;'> 80B </span>","children":null,"spread":false},{"title":"kitti_dataset.py <span style='color:#111;'> 4.58KB </span>","children":null,"spread":false},{"title":"mono_dataset.py <span style='color:#111;'> 6.96KB </span>","children":null,"spread":false}],"spread":true},{"title":"evaluate_depth.py <span style='color:#111;'> 8.05KB </span>","children":null,"spread":false},{"title":"kitti_utils.py <span style='color:#111;'> 3.47KB </span>","children":null,"spread":false},{"title":"experiments","children":[{"title":"odom_experiments.sh <span style='color:#111;'> 975B </span>","children":null,"spread":false},{"title":"mono_experiments.sh <span style='color:#111;'> 1.14KB </span>","children":null,"spread":false},{"title":"stereo_experiments.sh <span style='color:#111;'> 972B </span>","children":null,"spread":false},{"title":"mono+stereo_experiments.sh <span style='color:#111;'> 1.35KB </span>","children":null,"spread":false}],"spread":true},{"title":"networks","children":[{"title":"resnet_encoder.py <span style='color:#111;'> 3.95KB </span>","children":null,"spread":false},{"title":"__init__.py <span style='color:#111;'> 150B </span>","children":null,"spread":false},{"title":"depth_decoder.py <span style='color:#111;'> 2.17KB </span>","children":null,"spread":false},{"title":"pose_decoder.py <span style='color:#111;'> 1.83KB </span>","children":null,"spread":false},{"title":"pose_cnn.py <span style='color:#111;'> 1.47KB </span>","children":null,"spread":false}],"spread":true},{"title":"assets","children":[{"title":"copyright_notice.txt <span style='color:#111;'> 300B </span>","children":null,"spread":false},{"title":"test_image.jpg <span style='color:#111;'> 82.45KB </span>","children":null,"spread":false},{"title":"teaser.gif <span style='color:#111;'> 8.81MB </span>","children":null,"spread":false}],"spread":true},{"title":"LICENSE <span style='color:#111;'> 7.91KB </span>","children":null,"spread":false},{"title":"options.py <span style='color:#111;'> 10.34KB </span>","children":null,"spread":false},{"title":"layers.py <span style='color:#111;'> 7.96KB </span>","children":null,"spread":false},{"title":".github","children":[{"title":"ISSUE_TEMPLATE","children":[{"title":"training-on-custom-training-data.md <span style='color:#111;'> 1.58KB </span>","children":null,"spread":false},{"title":"problem-training-on-kitti.md <span style='color:#111;'> 516B </span>","children":null,"spread":false}],"spread":true}],"spread":true},{"title":"utils.py <span style='color:#111;'> 4.21KB </span>","children":null,"spread":false},{"title":"README.md <span style='color:#111;'> 14.13KB </span>","children":null,"spread":false},{"title":"export_gt_depth.py <span style='color:#111;'> 2.22KB </span>","children":null,"spread":false},{"title":"splits","children":[{"title":"odom","children":[{"title":"test_files_09.txt <span style='color:#111;'> 12.89KB </span>","children":null,"spread":false},{"title":"val_files.txt <span style='color:#111;'> 34.00KB </span>","children":null,"spread":false},{"title":"train_files.txt <span style='color:#111;'> 306.53KB </span>","children":null,"spread":false},{"title":"test_files_10.txt <span style='color:#111;'> 10.63KB 
</span>","children":null,"spread":false}],"spread":false},{"title":"eigen","children":[{"title":"test_files.txt <span style='color:#111;'> 34.71KB </span>","children":null,"spread":false}],"spread":false},{"title":"eigen_benchmark","children":[{"title":"test_files.txt <span style='color:#111;'> 27.80KB </span>","children":null,"spread":false}],"spread":false},{"title":"eigen_full","children":[{"title":"val_files.txt <span style='color:#111;'> 76.62KB </span>","children":null,"spread":false},{"title":"train_files.txt <span style='color:#111;'> 1.91MB </span>","children":null,"spread":false}],"spread":false},{"title":"kitti_archives_to_download.txt <span style='color:#111;'> 6.93KB </span>","children":null,"spread":false},{"title":"benchmark","children":[{"title":"eigen_to_benchmark_ids.npy <span style='color:#111;'> 5.22KB </span>","children":null,"spread":false},{"title":"val_files.txt <span style='color:#111;'> 258.50KB </span>","children":null,"spread":false},{"title":"train_files.txt <span style='color:#111;'> 3.04MB </span>","children":null,"spread":false},{"title":"test_files.txt <span style='color:#111;'> 4.78KB </span>","children":null,"spread":false}],"spread":false},{"title":"eigen_zhou","children":[{"title":"val_files.txt <span style='color:#111;'> 191.21KB </span>","children":null,"spread":false},{"title":"train_files.txt <span style='color:#111;'> 1.68MB </span>","children":null,"spread":false}],"spread":false}],"spread":false},{"title":"evaluate_pose.py <span style='color:#111;'> 4.74KB </span>","children":null,"spread":false},{"title":"test_simple.py <span style='color:#111;'> 6.73KB </span>","children":null,"spread":false},{"title":"depth_prediction_example.ipynb <span style='color:#111;'> 357.43KB </span>","children":null,"spread":false},{"title":"train.py <span style='color:#111;'> 507B </span>","children":null,"spread":false}],"spread":false}],"spread":true}]

Disclaimer

The resources on 【只为小站】 are shared by users and are provided for study and research only. Please delete any download within 24 hours; the files must not be used for any other purpose, and you bear sole responsibility for the consequences otherwise. Given the nature of the internet, 【只为小站】 cannot substantively verify the ownership, legality, compliance, authenticity, scientific validity, completeness, or effectiveness of works, information, or content uploaded by users; whether or not the site operator has reviewed such material, users themselves bear any legal liability for infringement or ownership disputes that may arise from the works, information, or content they transmit.
The resources on this site do not represent its views or positions; they are based on user sharing. Under Article 22 of China's Regulations on the Protection of the Right to Network Dissemination of Information, if a resource is infringing or otherwise problematic, please contact site support at zhiweidada#qq.com (replace the # with @); the site will cooperate fully and respond and act promptly. For more on copyright and liability, see the Copyright and Disclaimer page.