9. cGAN-Based pix2pix Model and Automatic Colorization: Python Implementation

Uploader: 43471818 | Upload time: 2019-12-21 21:48:23 | File size: 12.06MB | File type: zip
Based on a deep adversarial network (a conditional GAN), this resource builds a pix2pix model that automatically colorizes target objects. The code is written in Python; a minimal sketch of the training objective is given below.
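The core idea of pix2pix is a conditional GAN: a U-Net generator maps an input image (e.g. a grayscale photo or sketch) to a colorized output, while a PatchGAN discriminator judges (input, output) pairs; the generator is trained with an adversarial loss plus an L1 reconstruction loss. The snippet below is a minimal sketch of that objective, assuming TensorFlow 2.x / tf.keras. The names generator_loss, discriminator_loss, train_step and the LAMBDA weight are illustrative assumptions and do not reproduce the code shipped in pix2pix.py.

# Minimal sketch of the pix2pix cGAN objective (assumes TensorFlow 2.x / tf.keras).
import tensorflow as tf

LAMBDA = 100  # weight of the L1 reconstruction term, as suggested in the pix2pix paper
bce = tf.keras.losses.BinaryCrossentropy(from_logits=True)

def generator_loss(disc_fake_output, gen_output, target):
    # Adversarial term: the generator tries to make the discriminator predict "real".
    adv_loss = bce(tf.ones_like(disc_fake_output), disc_fake_output)
    # L1 term: keeps the colorized output close to the ground-truth color image.
    l1_loss = tf.reduce_mean(tf.abs(target - gen_output))
    return adv_loss + LAMBDA * l1_loss

def discriminator_loss(disc_real_output, disc_fake_output):
    # Real (input, target) pairs should score 1, fake (input, generated) pairs 0.
    real_loss = bce(tf.ones_like(disc_real_output), disc_real_output)
    fake_loss = bce(tf.zeros_like(disc_fake_output), disc_fake_output)
    return real_loss + fake_loss

@tf.function
def train_step(generator, discriminator, gen_opt, disc_opt, input_image, target):
    with tf.GradientTape() as g_tape, tf.GradientTape() as d_tape:
        gen_output = generator(input_image, training=True)
        # Conditional GAN: the discriminator also sees the input (e.g. grayscale) image.
        disc_real = discriminator([input_image, target], training=True)
        disc_fake = discriminator([input_image, gen_output], training=True)
        g_loss = generator_loss(disc_fake, gen_output, target)
        d_loss = discriminator_loss(disc_real, disc_fake)
    gen_opt.apply_gradients(zip(g_tape.gradient(g_loss, generator.trainable_variables),
                                generator.trainable_variables))
    disc_opt.apply_gradients(zip(d_tape.gradient(d_loss, discriminator.trainable_variables),
                                 discriminator.trainable_variables))
    return g_loss, d_loss

In this sketch, generator would typically be a U-Net and discriminator a PatchGAN built as Keras models with two inputs; the key conditional-GAN detail is that the discriminator always receives the input image alongside either the real target or the generated output.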

File Download

Resource Details

[{"title":"( 68 个子文件 12.06MB ) 9.基于cGAN的pix2pix 模型与自动上色技术 python代码实现","children":[{"title":"chapter_9","children":[{"title":".gitignore <span style='color:#111;'> 89B </span>","children":null,"spread":false},{"title":"README.md <span style='color:#111;'> 2.36KB </span>","children":null,"spread":false},{"title":"pix2pix.py <span style='color:#111;'> 34.90KB </span>","children":null,"spread":false},{"title":"docs","children":[{"title":"5-tensorflow.png <span style='color:#111;'> 95.53KB </span>","children":null,"spread":false},{"title":"1-inputs.png <span style='color:#111;'> 24.95KB </span>","children":null,"spread":false},{"title":"1-tensorflow.png <span style='color:#111;'> 97.63KB </span>","children":null,"spread":false},{"title":"ab.png <span style='color:#111;'> 12.97KB </span>","children":null,"spread":false},{"title":"5-inputs.png <span style='color:#111;'> 17.96KB </span>","children":null,"spread":false},{"title":"combine.png <span style='color:#111;'> 1.31MB </span>","children":null,"spread":false},{"title":"1-torch.jpg <span style='color:#111;'> 10.38KB </span>","children":null,"spread":false},{"title":"tensorboard-graph.png <span style='color:#111;'> 448.02KB </span>","children":null,"spread":false},{"title":"tensorboard-image.png <span style='color:#111;'> 347.29KB </span>","children":null,"spread":false},{"title":"95-inputs.png <span style='color:#111;'> 34.51KB </span>","children":null,"spread":false},{"title":"maps.jpg <span style='color:#111;'> 140.55KB </span>","children":null,"spread":false},{"title":"test-html.png <span style='color:#111;'> 4.18MB </span>","children":null,"spread":false},{"title":"cityscapes.jpg <span style='color:#111;'> 31.37KB </span>","children":null,"spread":false},{"title":"51-targets.png <span style='color:#111;'> 98.05KB </span>","children":null,"spread":false},{"title":"examples.jpg <span style='color:#111;'> 469.13KB </span>","children":null,"spread":false},{"title":"51-tensorflow.png <span style='color:#111;'> 109.64KB </span>","children":null,"spread":false},{"title":"tensorboard-scalar.png <span style='color:#111;'> 276.69KB </span>","children":null,"spread":false},{"title":"95-targets.png <span style='color:#111;'> 79.40KB </span>","children":null,"spread":false},{"title":"1-targets.png <span style='color:#111;'> 98.68KB </span>","children":null,"spread":false},{"title":"95-torch.jpg <span style='color:#111;'> 12.28KB </span>","children":null,"spread":false},{"title":"5-targets.png <span style='color:#111;'> 93.21KB </span>","children":null,"spread":false},{"title":"facades.jpg <span style='color:#111;'> 46.97KB </span>","children":null,"spread":false},{"title":"edges2shoes.jpg <span style='color:#111;'> 34.40KB </span>","children":null,"spread":false},{"title":"95-tensorflow.png <span style='color:#111;'> 111.15KB </span>","children":null,"spread":false},{"title":"51-torch.jpg <span style='color:#111;'> 12.73KB </span>","children":null,"spread":false},{"title":"edges2handbags.jpg <span style='color:#111;'> 27.49KB </span>","children":null,"spread":false},{"title":"51-inputs.png <span style='color:#111;'> 50.01KB </span>","children":null,"spread":false},{"title":"5-torch.jpg <span style='color:#111;'> 8.32KB </span>","children":null,"spread":false},{"title":"418.png <span style='color:#111;'> 115.18KB </span>","children":null,"spread":false}],"spread":false},{"title":"server","children":[{"title":"Dockerfile <span style='color:#111;'> 853B </span>","children":null,"spread":false},{"title":"README.md <span style='color:#111;'> 7.00KB 
</span>","children":null,"spread":false},{"title":"static","children":[{"title":"facades-sheet.jpg <span style='color:#111;'> 1.27MB </span>","children":null,"spread":false},{"title":"facades-input.png <span style='color:#111;'> 40.27KB </span>","children":null,"spread":false},{"title":"facades-output.png <span style='color:#111;'> 117.04KB </span>","children":null,"spread":false},{"title":"edges2cats-output.png <span style='color:#111;'> 59.00KB </span>","children":null,"spread":false},{"title":"edges2cats-sheet.jpg <span style='color:#111;'> 807.14KB </span>","children":null,"spread":false},{"title":"edges2cats-input.png <span style='color:#111;'> 3.16KB </span>","children":null,"spread":false},{"title":"edges2handbags-input.png <span style='color:#111;'> 4.70KB </span>","children":null,"spread":false},{"title":"edges2handbags-output.png <span style='color:#111;'> 74.20KB </span>","children":null,"spread":false},{"title":"edges2handbags-sheet.jpg <span style='color:#111;'> 932.95KB </span>","children":null,"spread":false},{"title":"editor.png <span style='color:#111;'> 34.53KB </span>","children":null,"spread":false},{"title":"edges2shoes-sheet.jpg <span style='color:#111;'> 676.56KB </span>","children":null,"spread":false},{"title":"index.html <span style='color:#111;'> 20.66KB </span>","children":null,"spread":false},{"title":"edges2shoes-output.png <span style='color:#111;'> 56.82KB </span>","children":null,"spread":false},{"title":"edges2shoes-input.png <span style='color:#111;'> 3.36KB </span>","children":null,"spread":false}],"spread":false},{"title":"deployment.tf <span style='color:#111;'> 3.32KB </span>","children":null,"spread":false},{"title":"tools","children":[{"title":"export-example-model.py <span style='color:#111;'> 1.43KB </span>","children":null,"spread":false},{"title":"rolling-update.py <span style='color:#111;'> 817B </span>","children":null,"spread":false},{"title":"process-cloud.py <span style='color:#111;'> 1.65KB </span>","children":null,"spread":false},{"title":"process-local.py <span style='color:#111;'> 1.62KB </span>","children":null,"spread":false},{"title":"process-remote.py <span style='color:#111;'> 766B </span>","children":null,"spread":false},{"title":"upload-image.py <span style='color:#111;'> 810B </span>","children":null,"spread":false},{"title":"upload-model.py <span style='color:#111;'> 3.82KB </span>","children":null,"spread":false}],"spread":true},{"title":"terraform.tfvars.example <span style='color:#111;'> 105B </span>","children":null,"spread":false},{"title":"serve.py <span style='color:#111;'> 11.31KB </span>","children":null,"spread":false}],"spread":true},{"title":"docker","children":[{"title":"Dockerfile <span style='color:#111;'> 4.16KB </span>","children":null,"spread":false}],"spread":true},{"title":"README_eng.md <span style='color:#111;'> 10.59KB </span>","children":null,"spread":false},{"title":"LICENSE.txt <span style='color:#111;'> 1.05KB </span>","children":null,"spread":false},{"title":"tools","children":[{"title":"split.py <span style='color:#111;'> 1.44KB </span>","children":null,"spread":false},{"title":"dockrun.py <span style='color:#111;'> 3.59KB </span>","children":null,"spread":false},{"title":"test.py <span style='color:#111;'> 3.66KB </span>","children":null,"spread":false},{"title":"process.py <span style='color:#111;'> 9.41KB </span>","children":null,"spread":false},{"title":"process_out_jpg.py <span style='color:#111;'> 9.41KB </span>","children":null,"spread":false},{"title":"tfimage.py <span style='color:#111;'> 
3.49KB </span>","children":null,"spread":false},{"title":"download-dataset.py <span style='color:#111;'> 645B </span>","children":null,"spread":false}],"spread":true}],"spread":true}],"spread":true}]

Comments

  • monday12138:
    It keeps throwing errors.
    2020-08-17
  • Y丶OU:
    It keeps throwing errors.
    2020-08-17
