TextBoxGan: generating text boxes from input words with a GAN

Uploader: 42160252 | Upload time: 2022-05-25 13:24:48 | File size: 2.8MB | File type: ZIP
TextBoxGan generates text boxes from input words with a generative adversarial network (the file listing below shows a StyleGAN2-based architecture). Video: the word "generation" generated at different training steps. Figure 1: examples of different words generated with the same style by the model.

Contents:

- trained model: a pre-trained model (see the corresponding section of the original README for details). Place this directory under `experiments/` (the archive ships an empty `experiments/` placeholder). To use it, replace `EXPERIMENT_NAME = None` with `EXPERIMENT_NAME = "trained model"` and make sure the config file sets `cfg.resume_step = 225000`.
- aster_weights: weights of the ASTER OCR model, converted to TensorFlow 2. Place this directory at the project root. Required for training the model, running the projector, and running inference on the test set.
- perceptual_weights: weights for the perceptual loss, converted from PyTorch. Place this directory under `projector/` (the archive ships empty `projector/perceptual_weights/lin` and `vgg` placeholders). Required for running the projector.

Build the Docker image with `docker build` (a `Dockerfile` and a `Makefile` sit at the project root).
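The checkpoint-loading instructions above amount to two configuration values. A minimal sketch, assuming a plain Python config module — the names `EXPERIMENT_NAME` and `cfg.resume_step` come from the project's description, but the surrounding structure is an assumption, not the actual `config/config.py`:

```python
# Hypothetical sketch of the two settings needed to resume from the
# pre-trained model; the real config/config.py may be structured differently.

class Config:
    # Checkpoint step to resume from; must match the shipped
    # pre-trained model's checkpoint (225000 per the description).
    resume_step = 225000

cfg = Config()

# None means "start a fresh experiment"; naming the pre-trained model's
# directory under experiments/ makes the loader resume from its checkpoint.
EXPERIMENT_NAME = "trained model"
```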


Resource details

(84 files, 2.8MB) TextBoxGan: generating text boxes from input words with a GAN

TextBoxGan-main/
    models/
        model_loader.py (2.32KB)
        word_encoder.py (2.13KB)
        losses/
            __pycache__/
                gan_losses.cpython-36.pyc (590B)
                __init__.cpython-38.pyc (153B)
                ocr_losses.cpython-38.pyc (674B)
                ocr_losses.cpython-36.pyc (651B)
                __init__.cpython-36.pyc (140B)
                gan_losses.cpython-38.pyc (586B)
            __init__.py (0B)
            ocr_losses.py (386B)
            gan_losses.py (327B)
        __init__.py (0B)
        stylegan2/
            utils.py (3.74KB)
            latent_encoder.py (3.25KB)
            generator.py (1.87KB)
            discriminator.py (6.32KB)
            layers/
                cuda/
                    fused_bias_act.py (8.64KB)
                    custom_ops.py (7.73KB)
                    upfirdn_2d.cu (14.79KB)
                    fused_bias_act.cu (7.61KB)
                    upfirdn_2d.py (17.36KB)
                    upfirdn_2d_v2.py (5.46KB)
                    __init__.py (235B)
                commons.py (529B)
                to_rgb.py (1.08KB)
                from_rgb.py (1019B)
                mapping_block.py (1.68KB)
                __init__.py (0B)
                mini_batch_std.py (1.29KB)
                conv.py (2.53KB)
                dense.py (1.23KB)
                noise.py (829B)
                modulated_conv2d.py (4.48KB)
                synthesis_block.py (5.12KB)
                bias_act.py (2.17KB)
    allow_memory_growth.py (683B)
    config/
        __init__.py (30B)
        config.py (4.05KB)
        char_tokens.py (661B)
    validation_step.py (2.60KB)
    train.py (8.88KB)
    projector/
        perceptual_weights/
            lin/
                .keep (0B)
            vgg/
                .keep (0B)
        lpips_tensorflow.py (6.60KB)
        projector.py (10.01KB)
    Dockerfile (336B)
    training_step.py (14.02KB)
    LICENSE (1.04KB)
    experiments/
        .keep (0B)
    requirements.txt (109B)
    aster_weights/
        .keep (0B)
    .gitignore (211B)
    Makefile (1.75KB)
    dataset_utils/
        text_box_dataset_metrics.py (1.83KB)
        training_data_loader.py (3.28KB)
        __pycache__/
            validation_data_loader.cpython-38.pyc (1.70KB)
            __init__.cpython-38.pyc (153B)
            validation_data_loader.cpython-36.pyc (1.66KB)
            training_data_loader.cpython-36.pyc (2.85KB)
            text_box_dataset_maker.cpython-38.pyc (2.92KB)
            __init__.cpython-36.pyc (140B)
            filter_out_bad_images.cpython-38.pyc (1.62KB)
        __init__.py (0B)
        text_corpus_dataset_maker.py (5.63KB)
        validation_data_loader.py (1.31KB)
        text_box_dataset_maker.py (2.18KB)
        filter_out_bad_images.py (1.90KB)
    ReadMe_images/
        character_distribution.png (119.01KB)
        words_with_the_same_style.png (1.43MB)
        losses_comparison.png (422.08KB)
        word_encoder.png (23.52KB)
        swapping_labels.png (162.13KB)
        projected_images.png (433.92KB)
        generating_words_mse_vs_sce.png (141.18KB)
        network.png (144.82KB)
    infere.py (6.52KB)
    README.md (17.17KB)
    aster_ocr_utils/
        aster_inferer.py (6.17KB)
        aster_tester.py (1.31KB)
        weigths_tf1_to_tf2.py (1.71KB)
    utils/
        loss_tracker.py (1.97KB)
        utils.py (2.94KB)
        tensorboard_writer.py (5.99KB)
        __init__.py (98B)
