Microgrid_sim — advisor and problem definition: We will build a model of a small power grid, called a "microgrid." The goal is to satisfy the microgrid's (local) demand while optimizing allocation and meeting requirements of fairness, reliability, and scalability — a (stochastic) model for energy resource management. Potential reference papers:
Borghetti, Alberto, et al. "Two-stage scheduler of distributed energy resources." Power Tech, 2007 IEEE Lausanne. IEEE, 2007.
Li, G.C., et al. "Development of a GHG-mitigation oriented inexact dynamic model for regional energy system management." Energy 36.5 (2011): 3388-3398.
Wong, Steven, and J. David Fuller. "Pricing energy and reserves using stochastic optimization in an alternative electricity market." IEEE Transactions on Power Systems 22.2 (2007): 631-638.
Wallace, Stein W., and Stein-Erik Fleten. "Stochastic programming models in energy."
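To make the two-stage idea in the Borghetti et al. reference concrete: commit local resources first, then take recourse (e.g. grid imports) once uncertain demand is revealed. Below is a minimal sketch of such a two-stage stochastic dispatch, solved via its deterministic-equivalent LP with scipy; all costs, capacities, and demand scenarios are invented for illustration and are not taken from the papers.

```python
# A minimal sketch (not the papers' models) of a two-stage stochastic
# dispatch for a single microgrid node, solved via its deterministic
# equivalent LP. All names, costs, and the demand scenarios below are
# illustrative assumptions.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

n_scen = 100                      # equally likely demand scenarios
demand = rng.normal(5.0, 1.0, n_scen).clip(min=0.0)  # MW, assumed
c_gen, c_grid = 20.0, 50.0        # $/MWh: local generation vs. grid import
x_max = 8.0                       # local generation capacity, assumed

# Decision vector z = [x, y_1, ..., y_S]:
#   x   = first-stage local generation commitment
#   y_s = second-stage (recourse) grid import in scenario s
c = np.concatenate([[c_gen], np.full(n_scen, c_grid / n_scen)])

# Demand balance per scenario: x + y_s >= d_s  <=>  -x - y_s <= -d_s
A_ub = np.hstack([-np.ones((n_scen, 1)), -np.eye(n_scen)])
b_ub = -demand

bounds = [(0.0, x_max)] + [(0.0, None)] * n_scen
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")

print(f"first-stage commitment x = {res.x[0]:.2f} MW, "
      f"expected cost = ${res.fun:.2f}")
```

With cheap local generation and expensive recourse, the optimal commitment lands at a demand quantile, newsvendor-style: increase x until the marginal commitment cost equals the expected marginal savings on grid imports.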
Stochastic Optimization by John C. Duchi
2021-04-08 13:10:00 1.11MB machine learning optimization statistics
A simple, clear introduction to the optimization algorithms used in machine learning; highly recommended.
2021-03-19 15:04:43 6.24MB optimization machine learning
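Duchi's notes revolve around stochastic gradient methods. Below is a minimal sketch of SGD on a least-squares problem; the data, noise level, and step size are illustrative choices (though 1/√t is the standard decaying rate for SGD).

```python
# A minimal sketch of stochastic gradient descent (SGD) on a
# least-squares objective, the prototypical problem in stochastic
# optimization. Data and step-size constants are invented.
import numpy as np

rng = np.random.default_rng(1)
n, d = 1000, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

w = np.zeros(d)
for t in range(1, 5001):
    i = rng.integers(n)                 # sample one example uniformly
    grad = (X[i] @ w - y[i]) * X[i]     # stochastic gradient of 0.5*(x'w - y)^2
    w -= (1.0 / np.sqrt(t)) * grad      # decaying 1/sqrt(t) step size

print("estimation error:", np.linalg.norm(w - w_true))
```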
ADAM:A METHOD FOR STOCHASTIC OPTIMIZATION.zip
2021-03-12 09:13:53 496KB deep learning
Unit Tests for Stochastic Optimization.pdf
2021-03-12 09:13:53 8.26MB deep learning
Stochastic Optimization. Authors: Johannes Josef Schneider, Scott Kirkpatrick. The search for optimal solutions pervades our daily lives. From the scientific point of view, optimization procedures play an eminent role whenever exact solutions to a given problem are not at hand or a compromise has to be sought, e.g. to obtain a sufficiently accurate solution within a given amount of time. This book addresses stochastic optimization procedures in a broad manner, giving an overview of the most relevant optimization philosophies in the first part. The second part deals with benchmark problems in depth, by applying in sequence a selection of optimization procedures to them. While having primarily scientists and students from the physical and engineering sciences in mind, this book addresses the larger community of all those wishing to learn about stochastic optimization techniques and how to use them.
2019-12-21 18:57:33 41.15MB stochastic optimization
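A canonical example of the stochastic optimization procedures surveyed in books like this is simulated annealing (Kirkpatrick is one of its originators). Below is a minimal sketch on an invented one-dimensional multimodal objective; the proposal width and cooling schedule are likewise illustrative choices.

```python
# A minimal sketch of simulated annealing. The objective, proposal
# width, and cooling schedule are all illustrative assumptions.
import math
import random

random.seed(0)

def f(x):
    # Toy multimodal objective; global minimum near x = -0.52
    return x * x + 10.0 * math.sin(3.0 * x)

x = 5.0                       # initial state
best_x, best_f = x, f(x)
T = 1.0                       # initial temperature
for step in range(10000):
    cand = x + random.gauss(0.0, 0.5)     # local random proposal
    delta = f(cand) - f(x)
    # Accept downhill moves always; uphill moves with prob exp(-delta/T)
    if delta < 0 or random.random() < math.exp(-delta / T):
        x = cand
        if f(x) < best_f:
            best_x, best_f = x, f(x)
    T *= 0.9995                           # geometric cooling

print(f"best x = {best_x:.3f}, f(best) = {best_f:.3f}")
```

The occasional acceptance of uphill moves is what lets the procedure escape local minima; as the temperature cools, it behaves more and more like greedy descent.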
The Adam algorithm for deep learning, shared here for study. We introduce Adam, an algorithm for first-order gradient-based optimization of stochastic objective functions, based on adaptive estimates of lower-order moments. The method is straightforward to implement, is computationally efficient, has little memory requirements, is invariant to diagonal rescaling of the gradients, and is well suited for problems that are large in terms of data and/or parameters. The method is also appropriate for non-stationary objectives and problems with very noisy and/or sparse gradients. The hyper-parameters have intuitive interpretations and typically require little tuning. Some connections to related algorithms, on which Adam was inspired, are discussed. We also analyze the theoretical convergence properties of the algorithm and provide a regret bound on the convergence rate that is comparable to the best known results under the online convex optimization framework. Empirical results demonstrate that Adam works well in practice and compares favorably to other stochastic optimization methods. Finally, we discuss AdaMax, a variant of Adam based on the infinity norm.
2019-12-21 18:52:51 571KB Adam algorithm
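The "adaptive estimates of lower-order moments" in the abstract are the exponential moving averages m and v below. This is a minimal numpy sketch of the Adam update from the paper (Kingma & Ba), using its stated default hyper-parameters; the noisy quadratic test objective is invented for illustration.

```python
# A minimal numpy implementation of the Adam update rule, with the
# paper's default hyper-parameters. The noisy quadratic objective is
# an invented test problem.
import numpy as np

rng = np.random.default_rng(0)

def noisy_grad(theta):
    # Gradient of 0.5*||theta||^2 plus Gaussian noise, mimicking a
    # stochastic objective
    return theta + 0.1 * rng.normal(size=theta.shape)

alpha, beta1, beta2, eps = 0.001, 0.9, 0.999, 1e-8  # paper defaults
theta = np.ones(10)
m = np.zeros_like(theta)   # first-moment (mean) estimate
v = np.zeros_like(theta)   # second-moment (uncentered variance) estimate

for t in range(1, 10001):
    g = noisy_grad(theta)
    m = beta1 * m + (1 - beta1) * g       # update biased first moment
    v = beta2 * v + (1 - beta2) * g * g   # update biased second moment
    m_hat = m / (1 - beta1 ** t)          # bias correction
    v_hat = v / (1 - beta2 ** t)
    theta -= alpha * m_hat / (np.sqrt(v_hat) + eps)

print("norm of theta after optimization:", np.linalg.norm(theta))
```

Dividing by the square root of the second-moment estimate is what makes the step size per coordinate adaptive, and is the source of the diagonal-rescaling invariance the abstract mentions.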