MDPtoolbox (MDP Toolbox)

Uploader: 43509834 | Upload time: 2021-02-03 12:11:51 | File size: 393KB | File type: ZIP
The MDP Toolbox (Markov Decision Process Toolbox) for MATLAB, containing the MATLAB code together with its documentation. (Note: this is the latest version of the toolbox as of February 2021.) The documentation is in the documentation folder as HTML pages; double-clicking a page opens the description of the corresponding function, and each description includes an example.
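
As a quick illustration of how the toolbox is typically used, the sketch below builds the forest-management example shipped with the toolbox and solves it by value iteration. It only calls functions that are present in the ZIP (mdp_example_forest, mdp_value_iteration), but the exact argument lists and the path used here are assumptions; check them against the HTML documentation before relying on them.

    % Add the toolbox to the MATLAB path
    % (adjust the folder name to wherever the ZIP was extracted)
    addpath('MDPtoolbox');

    % Generate the transition matrices P and reward matrix R
    % of the built-in forest-management example
    [P, R] = mdp_example_forest();

    % Solve the infinite-horizon discounted MDP by value iteration
    discount = 0.95;
    [V, policy] = mdp_value_iteration(P, R, discount);

    % V      : value of each state under the computed policy
    % policy : action to take in each state
    disp(policy);

A policy-iteration solver (mdp_policy_iteration) and a Q-learning routine (mdp_Q_learning) are also included; see DOCUMENTATION.html in the documentation folder for the full function index.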


Resource Details

[{"title":"( 55 个子文件 393KB ) MDPtoolbox(MDP工具箱)","children":[{"title":"license.txt <span style='color:#111;'> 1.27KB </span>","children":null,"spread":false},{"title":"MDPtoolbox","children":[{"title":"mdp_relative_value_iteration.m <span style='color:#111;'> 4.76KB </span>","children":null,"spread":false},{"title":"mdp_bellman_operator.m <span style='color:#111;'> 3.22KB </span>","children":null,"spread":false},{"title":"documentation","children":[{"title":"mdp_policy_iteration_modified.html <span style='color:#111;'> 4.85KB </span>","children":null,"spread":false},{"title":"mdp_computePpolicyPRpolicy.html <span style='color:#111;'> 3.28KB </span>","children":null,"spread":false},{"title":"mdp_value_iterationGS.html <span style='color:#111;'> 8.34KB </span>","children":null,"spread":false},{"title":"mdp_eval_policy_iterative.html <span style='color:#111;'> 7.33KB </span>","children":null,"spread":false},{"title":"mdp_eval_policy_matrix.html <span style='color:#111;'> 2.91KB </span>","children":null,"spread":false},{"title":"QuickStart.pdf <span style='color:#111;'> 302.72KB </span>","children":null,"spread":false},{"title":"mdp_verbose_silent.html <span style='color:#111;'> 2.39KB </span>","children":null,"spread":false},{"title":"mdp_example_rand.html <span style='color:#111;'> 3.74KB </span>","children":null,"spread":false},{"title":"mdp_relative_value_iteration.html <span style='color:#111;'> 7.50KB </span>","children":null,"spread":false},{"title":"mdp_finite_horizon.html <span style='color:#111;'> 4.06KB </span>","children":null,"spread":false},{"title":"arrow.gif <span style='color:#111;'> 231B </span>","children":null,"spread":false},{"title":"mdp_policy_iteration.html <span style='color:#111;'> 4.82KB </span>","children":null,"spread":false},{"title":"mdp_computePR.html <span style='color:#111;'> 2.82KB </span>","children":null,"spread":false},{"title":"mdp_check.html <span style='color:#111;'> 2.89KB </span>","children":null,"spread":false},{"title":"mdp_check_square_stochastic.html <span style='color:#111;'> 2.41KB </span>","children":null,"spread":false},{"title":"mdp_value_iteration_bound_iter.html <span style='color:#111;'> 3.50KB </span>","children":null,"spread":false},{"title":"index_alphabetic.html <span style='color:#111;'> 6.32KB </span>","children":null,"spread":false},{"title":"meandiscrepancy.jpg <span style='color:#111;'> 15.90KB </span>","children":null,"spread":false},{"title":"mdp_Q_learning.html <span style='color:#111;'> 4.09KB </span>","children":null,"spread":false},{"title":"mdp_span.html <span style='color:#111;'> 2.03KB </span>","children":null,"spread":false},{"title":"index_category.html <span style='color:#111;'> 6.86KB </span>","children":null,"spread":false},{"title":"mdp_value_iteration.html <span style='color:#111;'> 6.41KB </span>","children":null,"spread":false},{"title":"mdp_example_forest.html <span style='color:#111;'> 6.67KB </span>","children":null,"spread":false},{"title":"mdp_bellman_operator.html <span style='color:#111;'> 3.11KB </span>","children":null,"spread":false},{"title":"mdp_eval_policy_TD_0.html <span style='color:#111;'> 3.28KB </span>","children":null,"spread":false},{"title":"mdp_LP.html <span style='color:#111;'> 3.17KB </span>","children":null,"spread":false},{"title":"mdp_eval_policy_optimality.html <span style='color:#111;'> 3.60KB </span>","children":null,"spread":false},{"title":"DOCUMENTATION.html <span style='color:#111;'> 3.72KB </span>","children":null,"spread":false}],"spread":false},{"title":"mdp_silent.m <span 
style='color:#111;'> 1.70KB </span>","children":null,"spread":false},{"title":"mdp_Q_learning.m <span style='color:#111;'> 5.27KB </span>","children":null,"spread":false},{"title":"mdp_computePpolicyPRpolicy.m <span style='color:#111;'> 2.86KB </span>","children":null,"spread":false},{"title":"mdp_value_iteration_bound_iter.m <span style='color:#111;'> 4.66KB </span>","children":null,"spread":false},{"title":"mdp_computePR.m <span style='color:#111;'> 2.82KB </span>","children":null,"spread":false},{"title":"mdp_policy_iteration.m <span style='color:#111;'> 5.12KB </span>","children":null,"spread":false},{"title":"COPYING <span style='color:#111;'> 1.53KB </span>","children":null,"spread":false},{"title":"mdp_eval_policy_iterative.m <span style='color:#111;'> 5.37KB </span>","children":null,"spread":false},{"title":"mdp_span.m <span style='color:#111;'> 1.67KB </span>","children":null,"spread":false},{"title":"mdp_eval_policy_TD_0.m <span style='color:#111;'> 5.28KB </span>","children":null,"spread":false},{"title":"mdp_example_rand.m <span style='color:#111;'> 3.75KB </span>","children":null,"spread":false},{"title":"mdp_value_iterationGS.m <span style='color:#111;'> 6.89KB </span>","children":null,"spread":false},{"title":"mdp_value_iteration.m <span style='color:#111;'> 6.21KB </span>","children":null,"spread":false},{"title":"mdp_policy_iteration_modified.m <span style='color:#111;'> 5.43KB </span>","children":null,"spread":false},{"title":"mdp_check.m <span style='color:#111;'> 3.95KB </span>","children":null,"spread":false},{"title":"mdp_check_square_stochastic.m <span style='color:#111;'> 2.21KB </span>","children":null,"spread":false},{"title":"mdp_LP.m <span style='color:#111;'> 3.75KB </span>","children":null,"spread":false},{"title":"mdp_finite_horizon.m <span style='color:#111;'> 4.00KB </span>","children":null,"spread":false},{"title":"mdp_eval_policy_matrix.m <span style='color:#111;'> 3.24KB </span>","children":null,"spread":false},{"title":"README <span style='color:#111;'> 3.50KB </span>","children":null,"spread":false},{"title":"AUTHORS <span style='color:#111;'> 63B </span>","children":null,"spread":false},{"title":"mdp_eval_policy_optimality.m <span style='color:#111;'> 3.84KB </span>","children":null,"spread":false},{"title":"mdp_verbose.m <span style='color:#111;'> 1.71KB </span>","children":null,"spread":false},{"title":"mdp_example_forest.m <span style='color:#111;'> 4.62KB </span>","children":null,"spread":false}],"spread":false}],"spread":true}]
