
MATLAB Toolbox Collection – Markov Decision Process (MDP) Toolbox (MDPtoolbox)

Uploader: wenyusuran | Upload time: 2023/8/27 4:23:40 | File size: 398KB | File type: ZIP
Software ID: 15151404

File Download

Resource Details

Archive contents (55 files, 398KB):

MATLAB工具箱大全-马尔可夫决策过程(MDP)工具箱MDPtoolbox/
    MDPtoolbox/
        mdp_check_square_stochastic.m      2.21KB
        mdp_computePR.m                    2.82KB
        mdp_eval_policy_iterative.m        5.37KB
        AUTHORS                            63B
        COPYING                            1.53KB
        mdp_Q_learning.m                   5.27KB
        mdp_policy_iteration_modified.m    5.43KB
        mdp_example_forest.m               4.62KB
        mdp_silent.m                       1.70KB
        mdp_computePpolicyPRpolicy.m       2.86KB
        mdp_finite_horizon.m               4.00KB
        mdp_eval_policy_matrix.m           3.24KB
        mdp_check.m                        3.95KB
        mdp_value_iteration_bound_iter.m   4.66KB
        mdp_policy_iteration.m             5.12KB
        mdp_span.m                         1.67KB
        mdp_eval_policy_optimality.m       3.84KB
        mdp_relative_value_iteration.m     4.76KB
        mdp_bellman_operator.m             3.22KB
        mdp_value_iterationGS.m            6.89KB
        README                             3.50KB
        mdp_LP.m                           3.75KB
        mdp_value_iteration.m              6.21KB
        mdp_verbose.m                      1.71KB
        documentation/
            mdp_bellman_operator.html           3.11KB
            mdp_check.html                      2.89KB
            mdp_LP.html                         3.17KB
            DOCUMENTATION.html                  3.72KB
            mdp_eval_policy_optimality.html     3.60KB
            QuickStart.pdf                      302.72KB
            mdp_eval_policy_iterative.html      7.33KB
            mdp_policy_iteration_modified.html  4.85KB
            mdp_relative_value_iteration.html   7.50KB
            mdp_computePpolicyPRpolicy.html     3.28KB
            index_alphabetic.html               6.32KB
            mdp_example_forest.html             6.67KB
            index_category.html                 6.86KB
            meandiscrepancy.jpg                 15.90KB
            mdp_eval_policy_TD_0.html           3.28KB
            mdp_computePR.html                  2.82KB
            mdp_check_square_stochastic.html    2.41KB
            mdp_example_rand.html               3.74KB
            mdp_value_iterationGS.html          8.34KB
            mdp_value_iteration.html            6.41KB
            mdp_finite_horizon.html             4.06KB
            arrow.gif                           231B
            mdp_value_iteration_bound_iter.html 3.50KB
            mdp_verbose_silent.html             2.39KB
            mdp_Q_learning.html                 4.09KB
            mdp_span.html                       2.03KB
            mdp_policy_iteration.html           4.82KB
            mdp_eval_policy_matrix.html         2.91KB
        mdp_example_rand.m                 3.75KB
        mdp_eval_policy_TD_0.m             5.28KB
    license.txt                            1.27KB
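For orientation, the toolbox's `mdp_value_iteration.m` solves discounted MDPs by value iteration. Below is a minimal sketch of that algorithm in Python (an illustration of the method only, not the toolbox's MATLAB API; the array shapes and the tiny two-state MDP are assumptions chosen for the example):

```python
import numpy as np

def value_iteration(P, R, gamma=0.9, eps=1e-8):
    """Value iteration for a discounted MDP.
    P: (A, S, S) transition probabilities, P[a, s, s'] = Pr(s' | s, a).
    R: (S, A) rewards for taking action a in state s.
    Returns a near-optimal value function V and a greedy policy."""
    V = np.zeros(R.shape[0])
    while True:
        # Bellman backup: Q[s, a] = R[s, a] + gamma * sum_s' P[a, s, s'] * V[s']
        Q = R + gamma * np.einsum('ast,t->sa', P, V)
        V_new = Q.max(axis=1)
        # Standard stopping rule: bounds the distance of V_new from V*.
        if np.max(np.abs(V_new - V)) < eps * (1 - gamma) / (2 * gamma):
            return V_new, Q.argmax(axis=1)
        V = V_new

# Hypothetical 2-state, 2-action MDP: action 0 stays put, action 1 swaps states.
P = np.array([[[1.0, 0.0], [0.0, 1.0]],   # action 0: stay
              [[0.0, 1.0], [1.0, 0.0]]])  # action 1: swap
R = np.array([[0.0, 1.0],   # state 0: reward 1 for swapping
              [1.0, 0.0]])  # state 1: reward 1 for staying
V, policy = value_iteration(P, R, gamma=0.9)
# Under the greedy policy each step earns reward 1, so V ≈ 1/(1-0.9) = 10 in both states.
```

The toolbox's MATLAB routines follow the same pattern but take the transition and reward arrays in MATLAB cell/matrix form; see the bundled QuickStart.pdf and DOCUMENTATION.html for the actual call signatures.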

Comments

  • Tracer-Tang:
    If a user does not leave a review within a set period after downloading, the system records a positive review by default. 2021-07-07

Disclaimer

The resources on 【好快吧下载】 come from user sharing and are for study and research only. Please delete them within 24 hours of downloading; they must not be used for any other purpose, and you bear the consequences otherwise. Given the nature of the internet, 【好快吧下载】 cannot substantively verify the ownership, legality, compliance, authenticity, scientific validity, completeness, or effectiveness of works, information, or content transmitted by users; whether or not the operator of 【好快吧下载】 has reviewed such content, users bear any legal liability for infringement or ownership disputes that may arise from the content they transmit.
The resources on this site do not represent its views or positions and are based on user sharing. Pursuant to Article 22 of China's Regulations on the Protection of the Right to Network Dissemination of Information, if a resource infringes rights or raises related issues, please contact site support at 8686821#qq.com (replace # with @); the site will provide full support and cooperation and respond promptly. For more on copyright and disclaimers, see the Copyright and Disclaimer notice.