Tutorial on video classification/action recognition with 3D CNN / CNN + RNN on UCF101 (Python source code)

Uploaded by: 42106299 | Upload time: 2021-09-01 13:46:32 | File size: 8.78MB | File type: ZIP
This archive contains quick and simple PyTorch code for video classification (or action recognition) on UCF101. A video is treated either as a single 3D image or as a sequence of consecutive 2D frames (Fig. 1), and two simple neural network models are built on each view. Dataset: UCF101 contains 13,320 videos spanning 101 action classes. The videos vary in temporal length (number of frames) and in 2D frame size; the shortest clip has 28 frames. To avoid painful video preprocessing such as frame extraction and format conversion with OpenCV or FFmpeg, the preprocessed dataset from feichtenhofer is used here directly. If you want to convert videos or extract frames from scratch, these are good tutorials: https://pythonprogramming.net/loading-video-python-opencv-tutorial/ and https://www.pyimagesearch.com/2017/02/06/faster-video-file-fps-with-cv2-videocapture-and-opencv/ Models: 1. 3D CNN (Conv3D); 2. CNN + RNN (CRNN).
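If you do extract frames yourself rather than using the preprocessed dataset, the approach in the linked tutorials boils down to decoding the video with OpenCV and writing frames to disk. The sketch below is a minimal illustration of that idea; the paths, function name, and frame-naming scheme are assumptions for this example, not taken from the archive.

```python
# Minimal frame-extraction sketch with OpenCV (illustrative, not the archive's code).
import os
import cv2

def extract_frames(video_path: str, out_dir: str, every_n: int = 1) -> int:
    """Decode video_path and save every `every_n`-th frame as a JPEG; return count saved."""
    os.makedirs(out_dir, exist_ok=True)
    cap = cv2.VideoCapture(video_path)
    saved = idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:  # end of stream or decode failure
            break
        if idx % every_n == 0:
            cv2.imwrite(os.path.join(out_dir, f"frame{saved:05d}.jpg"), frame)
            saved += 1
        idx += 1
    cap.release()
    return saved

# Hypothetical usage on one UCF101 clip:
# extract_frames("UCF101/v_Kayaking_g01_c01.avi", "frames/v_Kayaking_g01_c01")
```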


Resource details

[{"title":"( 82 个子文件 8.78MB ) 在UCF101上使用3D CNN/CNN + RNN进行视频分类/动作识别的教程-python源码","children":[{"title":"video-classification-master","children":[{"title":"ResNetCRNN_varylength","children":[{"title":"functions.py <span style='color:#111;'> 11.86KB </span>","children":null,"spread":false},{"title":"ResNetCRNN_check_prediction.py <span style='color:#111;'> 3.64KB </span>","children":null,"spread":false},{"title":"UCF101_frame_count.pkl <span style='color:#111;'> 427.79KB </span>","children":null,"spread":false},{"title":"UCF101_ResNetCRNN_varlen.py <span style='color:#111;'> 10.39KB </span>","children":null,"spread":false},{"title":"UCF101_tvflow_v_frame_count.pkl <span style='color:#111;'> 418.05KB </span>","children":null,"spread":false},{"title":"results","children":[{"title":"replot_loss.ipynb <span style='color:#111;'> 118.93KB </span>","children":null,"spread":false},{"title":"CRNN_varlen_epoch_test_loss.npy <span style='color:#111;'> 1.15KB </span>","children":null,"spread":false},{"title":"loss_UCF101_CRNN.png <span style='color:#111;'> 581.80KB </span>","children":null,"spread":false},{"title":"CRNN_varlen_epoch_training_loss.npy <span style='color:#111;'> 1.15KB </span>","children":null,"spread":false},{"title":"CRNN_varlen_epoch_training_score.npy <span style='color:#111;'> 1.15KB </span>","children":null,"spread":false},{"title":"CRNN_varlen_epoch_test_score.npy <span style='color:#111;'> 1.15KB </span>","children":null,"spread":false}],"spread":true},{"title":"UCF101_tvflow_u_frame_count.pkl <span style='color:#111;'> 418.05KB </span>","children":null,"spread":false},{"title":"UCF101actions.pkl <span style='color:#111;'> 1.76KB </span>","children":null,"spread":false},{"title":"check_predictions","children":[{"title":"UCF101_videos_prediction.pkl <span style='color:#111;'> 667.41KB </span>","children":null,"spread":false},{"title":"check_video_predictions.ipynb <span style='color:#111;'> 815.26KB </span>","children":null,"spread":false},{"title":"wrong_predictions.pkl <span style='color:#111;'> 44.68KB </span>","children":null,"spread":false}],"spread":true}],"spread":true},{"title":"Conv3D","children":[{"title":"functions.py <span style='color:#111;'> 15.07KB </span>","children":null,"spread":false},{"title":"Conv3D_check_prediction.py <span style='color:#111;'> 3.38KB </span>","children":null,"spread":false},{"title":"__pycache__","children":[{"title":"load_data.cpython-36.pyc <span style='color:#111;'> 3.16KB </span>","children":null,"spread":false},{"title":"functions.cpython-36.pyc <span style='color:#111;'> 8.68KB </span>","children":null,"spread":false}],"spread":true},{"title":".DS_Store <span style='color:#111;'> 6.00KB </span>","children":null,"spread":false},{"title":"outputs","children":[{"title":"UCF101_videos_prediction.pkl <span style='color:#111;'> 667.52KB </span>","children":null,"spread":false},{"title":"replot_loss.ipynb <span style='color:#111;'> 104.24KB </span>","children":null,"spread":false},{"title":"Conv3D_epoch_training_scores.npy <span style='color:#111;'> 13.13KB </span>","children":null,"spread":false},{"title":"Conv3D_epoch_training_losses.npy <span style='color:#111;'> 13.13KB </span>","children":null,"spread":false},{"title":"loss_UCF101_3DCNN.png <span style='color:#111;'> 518.10KB </span>","children":null,"spread":false},{"title":"Conv3D_epoch_test_score.npy <span style='color:#111;'> 168B </span>","children":null,"spread":false},{"title":"fig_UCF101_3DCNN.png <span style='color:#111;'> 534.50KB 
</span>","children":null,"spread":false},{"title":".ipynb_checkpoints","children":[{"title":"replot_loss-checkpoint.ipynb <span style='color:#111;'> 104.24KB </span>","children":null,"spread":false}],"spread":false},{"title":"Conv3D_epoch_test_loss.npy <span style='color:#111;'> 168B </span>","children":null,"spread":false}],"spread":true},{"title":"UCF101_3DCNN.py <span style='color:#111;'> 7.74KB </span>","children":null,"spread":false},{"title":"UCF101actions.pkl <span style='color:#111;'> 1.76KB </span>","children":null,"spread":false},{"title":"check_predictions","children":[{"title":"UCF101_videos_prediction.pkl <span style='color:#111;'> 667.52KB </span>","children":null,"spread":false},{"title":"check_video_predictions.ipynb <span style='color:#111;'> 851.05KB </span>","children":null,"spread":false},{"title":"wrong_predictions.pkl <span style='color:#111;'> 86.56KB </span>","children":null,"spread":false},{"title":".DS_Store <span style='color:#111;'> 6.00KB </span>","children":null,"spread":false},{"title":".ipynb_checkpoints","children":[{"title":"check_video_predictions-checkpoint.ipynb <span style='color:#111;'> 851.05KB </span>","children":null,"spread":false}],"spread":false}],"spread":true}],"spread":true},{"title":"CRNN","children":[{"title":"functions.py <span style='color:#111;'> 15.07KB </span>","children":null,"spread":false},{"title":"__pycache__","children":[{"title":"load_data.cpython-36.pyc <span style='color:#111;'> 3.13KB </span>","children":null,"spread":false},{"title":"functions.cpython-36.pyc <span style='color:#111;'> 8.68KB </span>","children":null,"spread":false}],"spread":true},{"title":".DS_Store <span style='color:#111;'> 6.00KB </span>","children":null,"spread":false},{"title":"outputs","children":[{"title":"CRNN_epoch_training_losses.npy <span style='color:#111;'> 75.57KB </span>","children":null,"spread":false},{"title":"CRNN_epoch_training_scores.npy <span style='color:#111;'> 75.57KB </span>","children":null,"spread":false},{"title":"replot_loss.ipynb <span style='color:#111;'> 137.35KB </span>","children":null,"spread":false},{"title":"loss_UCF101_CRNN.png <span style='color:#111;'> 980.89KB </span>","children":null,"spread":false},{"title":".DS_Store <span style='color:#111;'> 6.00KB </span>","children":null,"spread":false},{"title":"CRNN_epoch_test_score.npy <span style='color:#111;'> 360B </span>","children":null,"spread":false},{"title":"CRNN_epoch_test_loss.npy <span style='color:#111;'> 360B </span>","children":null,"spread":false}],"spread":true},{"title":"UCF101_CRNN.py <span style='color:#111;'> 8.52KB </span>","children":null,"spread":false},{"title":"CRNN_check_prediction.py <span style='color:#111;'> 3.73KB </span>","children":null,"spread":false},{"title":"UCF101actions.pkl <span style='color:#111;'> 1.76KB </span>","children":null,"spread":false},{"title":"check_predictions","children":[{"title":"check_video_predictions.ipynb <span style='color:#111;'> 815.26KB </span>","children":null,"spread":false},{"title":".DS_Store <span style='color:#111;'> 6.00KB </span>","children":null,"spread":false}],"spread":true}],"spread":true},{"title":".DS_Store <span style='color:#111;'> 8.00KB </span>","children":null,"spread":false},{"title":"ResNetCRNN","children":[{"title":"functions.py <span style='color:#111;'> 15.07KB </span>","children":null,"spread":false},{"title":"ResNetCRNN_check_prediction.py <span style='color:#111;'> 3.70KB </span>","children":null,"spread":false},{"title":"__pycache__","children":[{"title":"load_data.cpython-36.pyc 
<span style='color:#111;'> 3.13KB </span>","children":null,"spread":false},{"title":"functions.cpython-36.pyc <span style='color:#111;'> 8.68KB </span>","children":null,"spread":false}],"spread":true},{"title":".DS_Store <span style='color:#111;'> 6.00KB </span>","children":null,"spread":false},{"title":"outputs","children":[{"title":"CRNN_epoch_training_losses.npy <span style='color:#111;'> 88.02KB </span>","children":null,"spread":false},{"title":"CRNN_epoch_training_scores.npy <span style='color:#111;'> 88.02KB </span>","children":null,"spread":false},{"title":"replot_loss.ipynb <span style='color:#111;'> 165.31KB </span>","children":null,"spread":false},{"title":"loss_UCF101_ResNetCRNN.png <span style='color:#111;'> 777.96KB </span>","children":null,"spread":false},{"title":".DS_Store <span style='color:#111;'> 6.00KB </span>","children":null,"spread":false},{"title":"CRNN_epoch_test_score.npy <span style='color:#111;'> 488B </span>","children":null,"spread":false},{"title":".ipynb_checkpoints","children":[{"title":"replot_loss-checkpoint.ipynb <span style='color:#111;'> 153.78KB </span>","children":null,"spread":false}],"spread":false},{"title":"CRNN_epoch_test_loss.npy <span style='color:#111;'> 488B </span>","children":null,"spread":false}],"spread":true},{"title":"UCF101_ResNetCRNN.py <span style='color:#111;'> 9.16KB </span>","children":null,"spread":false},{"title":"UCF101actions.pkl <span style='color:#111;'> 1.76KB </span>","children":null,"spread":false},{"title":"check_predictions","children":[{"title":"UCF101_videos_prediction.pkl <span style='color:#111;'> 667.41KB </span>","children":null,"spread":false},{"title":"check_video_predictions.ipynb <span style='color:#111;'> 815.26KB </span>","children":null,"spread":false},{"title":"wrong_predictions.pkl <span style='color:#111;'> 44.68KB </span>","children":null,"spread":false},{"title":".ipynb_checkpoints","children":[{"title":"check_video_predictions-checkpoint.ipynb <span style='color:#111;'> 815.26KB </span>","children":null,"spread":false}],"spread":false}],"spread":true}],"spread":true},{"title":"README.md <span style='color:#111;'> 6.68KB </span>","children":null,"spread":false},{"title":"fig","children":[{"title":"loss_3DCNN.png <span style='color:#111;'> 867.34KB </span>","children":null,"spread":false},{"title":"wrong_pred.png <span style='color:#111;'> 123.97KB </span>","children":null,"spread":false},{"title":"loss_ResNetCRNN.png <span style='color:#111;'> 1.04MB </span>","children":null,"spread":false},{"title":"f_CNN.png <span style='color:#111;'> 26.76KB </span>","children":null,"spread":false},{"title":".DS_Store <span style='color:#111;'> 6.00KB </span>","children":null,"spread":false},{"title":"kayaking.gif <span style='color:#111;'> 2.22MB </span>","children":null,"spread":false},{"title":"loss_CRNN.png <span style='color:#111;'> 980.89KB </span>","children":null,"spread":false},{"title":"CRNN.png <span style='color:#111;'> 647.23KB </span>","children":null,"spread":false}],"spread":true}],"spread":true}],"spread":true}]

