NIH-Chest-X-rays-Classification: NIH Chest X-ray Classification

Uploader: 38516658 | Upload time: 2022-04-24 18:56:42 | File size: 32.4MB | File type: ZIP
NIH-Chest-X-rays-Classification setup. This UC Berkeley Master of Information and Data Science (W207) final project was developed by , , and .

Project overview: the project aims to classify the NIH Chest X-ray dataset using deep neural network architectures. We optimize our models in incremental steps: we first tune hyperparameters, then experiment with different architectures, and finally build our final model. The motivation behind the project is to replicate or improve upon the results reported in the following paper: . The project's workflow is based on the approach of Chahhou Mohammed, winner of Kaggle's $1,000,000 Zillow home-price prediction prize: systematically build a simple model, then gradually add complexity while running a grid search over the hyperparameters. Here we apply the same approach to the Kaggle dataset of NIH chest X-ray images. Collected by the NIH, the dataset contains over 100,000 anonymized chest X-ray images from more than 30,000 patients. The labels were derived from an NLP analysis of the radiology reports and may include regions of lower diagnostic confidence; as a simplifying assumption, given the size of the dataset, we treat the diagnoses as accurate. One of the difficulties of this problem is the data's lack of "diagnostic confidence"…
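The archive's notebooks (train-simple-xray-cnn-multi-binarizer.ipynb and its v2/v3 variants) suggest a Keras CNN trained on multi-label targets. Below is a minimal sketch of that setup, assuming the standard Kaggle layout of the NIH dataset (a Data_Entry_2017.csv index with a pipe-separated "Finding Labels" column); the file name, column name, image size, and learning-rate grid are illustrative assumptions, not code taken from this archive.

    import pandas as pd
    from sklearn.preprocessing import MultiLabelBinarizer
    from tensorflow.keras import layers, models, optimizers

    # Each image can carry several findings at once (e.g. "Cardiomegaly|Effusion"),
    # so the task is multi-label: one independent yes/no output per diagnosis.
    df = pd.read_csv("Data_Entry_2017.csv")         # assumed index file name
    findings = df["Finding Labels"].str.split("|")  # assumed column name

    mlb = MultiLabelBinarizer()
    y = mlb.fit_transform(findings)                 # shape: (n_images, n_classes)

    def build_simple_cnn(input_shape=(128, 128, 1), n_classes=15,
                         learning_rate=1e-3):
        # A deliberately small starting model, matching the strategy of
        # beginning simple and adding complexity only incrementally.
        model = models.Sequential([
            layers.Conv2D(32, 3, activation="relu", input_shape=input_shape),
            layers.MaxPooling2D(),
            layers.Conv2D(64, 3, activation="relu"),
            layers.MaxPooling2D(),
            layers.Flatten(),
            layers.Dense(128, activation="relu"),
            # Sigmoid, not softmax: per-class probabilities need not sum to 1.
            layers.Dense(n_classes, activation="sigmoid"),
        ])
        # Binary cross-entropy scores each diagnosis as its own binary
        # problem, the standard loss for multi-label classification.
        model.compile(optimizer=optimizers.Adam(learning_rate=learning_rate),
                      loss="binary_crossentropy",
                      metrics=["binary_accuracy"])
        return model

    # The grid-search workflow described above, reduced to a single axis:
    # retrain the same small model at several learning rates and compare
    # validation metrics before adding any architectural complexity.
    for lr in (1e-2, 1e-3, 1e-4):
        model = build_simple_cnn(n_classes=y.shape[1], learning_rate=lr)
        # model.fit(x_train, y_train, validation_data=(x_val, y_val), epochs=5)
        # (image loading and resizing are omitted from this sketch)

The sigmoid plus binary cross-entropy pairing is what lets one image score positive for several diagnoses at once, which a softmax output would rule out.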

Resource Details

[{"title":"( 75 个子文件 32.4MB ) 颜色分类leetcode-NIH-Chest-X-rays-Classification:NIH-胸部-X射线-分类","children":[{"title":"NIH-Chest-X-rays-Classification-master","children":[{"title":"src","children":[{"title":"ensemble_model.ipynb <span style='color:#111;'> 863.24KB </span>","children":null,"spread":false},{"title":"train.py <span style='color:#111;'> 16.25KB </span>","children":null,"spread":false},{"title":"reset.py <span style='color:#111;'> 775B </span>","children":null,"spread":false},{"title":"v3-train-simple-xray-cnn-multi-binarizer-Image-Size.ipynb <span style='color:#111;'> 1.45MB </span>","children":null,"spread":false},{"title":"utils.py <span style='color:#111;'> 5.69KB </span>","children":null,"spread":false},{"title":"ROCs.ipynb <span style='color:#111;'> 5.61KB </span>","children":null,"spread":false},{"title":"data_preparation.py <span style='color:#111;'> 3.28KB </span>","children":null,"spread":false},{"title":"keys","children":[{"title":"biseda_id_rsa.pub <span style='color:#111;'> 391B </span>","children":null,"spread":false}],"spread":true},{"title":"v2-train-simple-xray-cnn-multi-binarizer.ipynb <span style='color:#111;'> 1006.49KB </span>","children":null,"spread":false},{"title":"attention-maps.ipynb <span style='color:#111;'> 10.08KB </span>","children":null,"spread":false},{"title":"gradient_accumulation.py <span style='color:#111;'> 4.55KB </span>","children":null,"spread":false},{"title":"train-simple-xray-cnn-multi-binarizer.ipynb <span style='color:#111;'> 1.29MB </span>","children":null,"spread":false},{"title":"params.py <span style='color:#111;'> 739B </span>","children":null,"spread":false}],"spread":false},{"title":"images","children":[{"title":"adjusted_frequencies.png <span style='color:#111;'> 23.29KB </span>","children":null,"spread":false},{"title":"barely_trained_net.png <span style='color:#111;'> 93.25KB </span>","children":null,"spread":false},{"title":"all_model_validation_accuracy.png <span style='color:#111;'> 287.68KB </span>","children":null,"spread":false},{"title":"image_size_table.png <span style='color:#111;'> 13.97KB </span>","children":null,"spread":false},{"title":"image_size_comparison.png <span style='color:#111;'> 74.81KB </span>","children":null,"spread":false},{"title":"all_model_validation_accuracy.1.png <span style='color:#111;'> 257.19KB </span>","children":null,"spread":false},{"title":"tensorboard_graph_example.png <span style='color:#111;'> 194.37KB </span>","children":null,"spread":false},{"title":"all_model_validation_loss.png <span style='color:#111;'> 231.56KB </span>","children":null,"spread":false},{"title":"all_model_validation_loss.1.png <span style='color:#111;'> 210.06KB </span>","children":null,"spread":false},{"title":"all_model_tensorboard.png <span style='color:#111;'> 151.12KB </span>","children":null,"spread":false},{"title":"nn architecture comparisons.png <span style='color:#111;'> 178.63KB </span>","children":null,"spread":false},{"title":"attention_improvement.png <span style='color:#111;'> 58.75KB </span>","children":null,"spread":false},{"title":"high_confidence_diagnoses.png <span style='color:#111;'> 603.77KB </span>","children":null,"spread":false},{"title":"all_diagnoses.png <span style='color:#111;'> 23.72KB </span>","children":null,"spread":false},{"title":"simple_model_keras.png <span style='color:#111;'> 14.34KB </span>","children":null,"spread":false},{"title":"all_model_validation_accuracy.2.png <span style='color:#111;'> 442.10KB </span>","children":null,"spread":false},{"title":"paper correlation of 
diagnoses.png <span style='color:#111;'> 216.27KB </span>","children":null,"spread":false},{"title":"hard_vote_ensemble.png <span style='color:#111;'> 90.44KB </span>","children":null,"spread":false},{"title":"ensemble_results_table.png <span style='color:#111;'> 25.02KB </span>","children":null,"spread":false},{"title":"rgb_vs_grayscale.png <span style='color:#111;'> 52.87KB </span>","children":null,"spread":false},{"title":"table of architecture results.png <span style='color:#111;'> 224.65KB </span>","children":null,"spread":false},{"title":"attention_map.png <span style='color:#111;'> 3.60MB </span>","children":null,"spread":false},{"title":"mobilenet.png <span style='color:#111;'> 172.91KB </span>","children":null,"spread":false},{"title":"attention_map.1.png <span style='color:#111;'> 2.31MB </span>","children":null,"spread":false},{"title":"attention_map.2.png <span style='color:#111;'> 3.33MB </span>","children":null,"spread":false},{"title":"nvidia_tx2.png <span style='color:#111;'> 203.89KB </span>","children":null,"spread":false},{"title":"gradient_accumulation_and_learning_rate.png <span style='color:#111;'> 13.37KB </span>","children":null,"spread":false},{"title":"mobilenet_arch.png <span style='color:#111;'> 49.05KB </span>","children":null,"spread":false},{"title":"all_model_validation_loss.2.png <span style='color:#111;'> 493.93KB </span>","children":null,"spread":false},{"title":"cnn_high_level.jpeg <span style='color:#111;'> 86.94KB </span>","children":null,"spread":false},{"title":"W207_Project_Roadmap.png <span style='color:#111;'> 51.59KB </span>","children":null,"spread":false},{"title":"weighted_ensemble_roc.png <span style='color:#111;'> 84.61KB </span>","children":null,"spread":false},{"title":"resnet_block.png <span style='color:#111;'> 18.26KB </span>","children":null,"spread":false},{"title":"tx2_roc_mobilenet.png <span style='color:#111;'> 91.34KB </span>","children":null,"spread":false},{"title":"clean_categories.png <span style='color:#111;'> 24.16KB </span>","children":null,"spread":false},{"title":"optimizer_selection_original.png <span style='color:#111;'> 58.72KB </span>","children":null,"spread":false},{"title":"journal.pone.0029740.g001.png <span style='color:#111;'> 849.79KB </span>","children":null,"spread":false},{"title":"resnet_arch.png <span style='color:#111;'> 184.55KB </span>","children":null,"spread":false},{"title":"vgg_arch.png <span style='color:#111;'> 131.41KB </span>","children":null,"spread":false},{"title":"multi_tensorboard.png <span style='color:#111;'> 130.50KB </span>","children":null,"spread":false},{"title":"lenet_arch.jpg <span style='color:#111;'> 41.45KB </span>","children":null,"spread":false}],"spread":false},{"title":".vscode","children":[{"title":"launch.json <span style='color:#111;'> 500B </span>","children":null,"spread":false},{"title":"settings.json <span style='color:#111;'> 52B </span>","children":null,"spread":false}],"spread":true},{"title":"keys","children":[{"title":"spyros_id_rsa.pub <span style='color:#111;'> 401B </span>","children":null,"spread":false},{"title":"biseda_id_rsa.pub <span style='color:#111;'> 391B </span>","children":null,"spread":false}],"spread":true},{"title":"tx2","children":[{"title":"inference.py <span style='color:#111;'> 6.38KB </span>","children":null,"spread":false},{"title":"tx2-down.sh <span style='color:#111;'> 50B </span>","children":null,"spread":false},{"title":"mobilenetInferenceDF.csv <span style='color:#111;'> 399B 
</span>","children":null,"spread":false},{"title":"xray_class_weights.best.hdf5 <span style='color:#111;'> 14.46MB </span>","children":null,"spread":false},{"title":"cl_mobilenetInferenceDF.csv <span style='color:#111;'> 403B </span>","children":null,"spread":false},{"title":"inference.ipynb <span style='color:#111;'> 136.49KB </span>","children":null,"spread":false},{"title":"Dockerfile.tx2_tensorflow <span style='color:#111;'> 703B </span>","children":null,"spread":false},{"title":"tx2-up.sh <span style='color:#111;'> 669B </span>","children":null,"spread":false}],"spread":true},{"title":"LICENSE <span style='color:#111;'> 1.05KB </span>","children":null,"spread":false},{"title":"README.md <span style='color:#111;'> 26.27KB </span>","children":null,"spread":false},{"title":"docs","children":[{"title":"Wang_ChestX-ray8_Hospital-Scale_Chest_CVPR_2017_paper.pdf <span style='color:#111;'> 1.57MB </span>","children":null,"spread":false},{"title":"W207_Project_Roadmap.pptx <span style='color:#111;'> 47.29KB </span>","children":null,"spread":false},{"title":"W207_Project_Roadmap.png <span style='color:#111;'> 52.08KB </span>","children":null,"spread":false}],"spread":true},{"title":"Dockerfile.dev <span style='color:#111;'> 1.86KB </span>","children":null,"spread":false},{"title":".gitignore <span style='color:#111;'> 1.22KB </span>","children":null,"spread":false},{"title":"setup.sh <span style='color:#111;'> 614B </span>","children":null,"spread":false},{"title":"requirements.pip <span style='color:#111;'> 2.17KB </span>","children":null,"spread":false}],"spread":false}],"spread":true}]
